It’s been a rough decade for the concept of trust. Faith in major institutions is at an all-time low. The tech industry has, if anything, streamlined its ability to burn people out. And to assume your employer has your best interests at heart often feels like a quaint notion from a bygone age.
But trust is useful! When we put trust in ourselves, we spend less time mired in self-doubt and more on personal growth. When we extend trust to our teammates, we accomplish more together than alone. And when our organizations earn our trust, aligning on a shared vision can transform how we work.
So that’s why I wrote this talk. Consider it a reflection on how we—as individuals, teams, and organizations—can recover from the (justifiable!) pervasive distrust in our industry by reframing trust as an intrinsic orientation instead of an extrinsic commitment. In it, I share six stories from my career as a software developer to illustrate how trustful and distrustful orientations can each create reinforcing loops towards increasingly virtuous and vicious outcomes, respectively.
This presentation was recorded as the keynote address for the inaugural Reliable Web Summit.
If this talk resonates with you, please share it with your friends and colleagues! In addition to software development, Test Double helps its clients improve how they work as a team—if your team is working to improve, we’d love to talk to you. And if you’re interested in helping other teams foster high-trust relationships, consider taking a look at our open positions.
[00:00] (upbeat music)
[00:03] - The title of this presentation is How to Trust Again.
[00:07] And you know what, as soon as I typed that,
[00:09] I knew that I might be a little overambitious.
[00:11] In fact, the last time I gave a how-to talk,
[00:14] it was titled How to Program.
[00:16] And so maybe I'm just inclined to over promise
[00:19] and under deliver.
[00:20] A more honest title might be how to take a moment to pause
[00:24] and reflect on your relationship with trust as it pertains
[00:26] to software development through six personal anecdotes
[00:28] that may or may not resonate with you.
[00:31] Yeah.
[00:32] So anyway, we're talking about trust today and,
[00:36] I think that it's only fair to say that
[00:37] when you buy a product, you assume that you can trust it.
[00:40] So it's natural to trust software by default too, right?
[00:44] And that would be true.
[00:46] But if you know me, if you follow me on Twitter,
[00:48] you've probably seen me just tweet the word neat
[00:51] accompanied with a screenshot of software
[00:54] and hardware failing me in interesting and surprising ways.
[00:57] And I can't speak for you, but at least for me,
[00:59] I've come to distrust software by default
[01:02] over the course of my career.
[01:03] And a lot of my work has been to try to like make it better.
[01:07] In fact,
[01:08] almost 10 years ago now I co-founded a software consultancy
[01:11] called Test Double.
[01:12] And that was the first half of our mission.
[01:14] That software is broken.
[01:15] And then the second half being
[01:16] that we're here to make it better.
[01:19] So if you are either a company
[01:21] that needs more senior engineers
[01:22] and you could use some additional help,
[01:24] or if you're looking for a new opportunity
[01:26] and you're interested in consulting,
[01:27] I hope you'll check us out.
[01:29] My name is Justin Searls and you can reach me here
[01:31] or find me on Twitter, GitHub, LinkedIn with my last name,
[01:35] Searls, as my handle.
[01:37] And I hope we'll be able to connect.
[01:40] Now, it's conference season
[01:41] and so that means people are excited about new tools
[01:44] and practices and ways of doing stuff.
[01:46] And it's true.
[01:46] There's a lot to be excited about,
[01:47] but maybe I've just been at this too long,
[01:50] and I've got this nagging question about tools, which is,
[01:52] why is it that one person could have a problem,
[01:56] prescribe a particular solution, and have a great outcome,
[01:59] but another person with the same problem,
[02:01] prescribing the same solution, could have
[02:03] the exact opposite, negative outcome?
[02:06] And so, these two people might run into the hallway track
[02:10] at a conference and say,
[02:10] Hey, you know, like, Searls advocates for this thing.
[02:14] And the other person says, no, it's a hoax.
[02:16] It's a total waste of time.
[02:17] And you know, the weird thing about this is that
[02:19] they're both kinda right,
[02:21] they're speaking to their own valid experiences.
[02:23] But like you wouldn't expect this to happen
[02:25] as often as it does in software
[02:27] where we just have people line up on either side and say,
[02:29] one tool is awesome and the same tool's terrible.
[02:32] And it keeps fueling the fires of us reaching for novelty
[02:35] and new tools and process.
[02:37] So to think about this more generally,
[02:39] let's switch to a radically different context and imagine
[02:42] that there was an avocado conference.
[02:44] And you could imagine a bunch of tools there for slicing
[02:47] and storing and keeping avocados, or,
[02:50] preserving them in different ways
[02:52] and all sorts of creative avocado innovations,
[02:54] like how to grow one out of a seed like this.
[02:57] And yeah, that would be really cool.
[02:59] And people I'm sure would be excited because tools are good.
[03:02] Like, for example,
[03:03] if you've ever had the problem with an avocado
[03:04] turning brown, you could use a tool to keep it fresh longer,
[03:07] and then it stays green and there you fixed it.
[03:10] Tools are good most of the time,
[03:13] but they can have unintended consequences.
[03:15] So for example,
[03:16] if you've got a brown avocado
[03:18] and then you decide to use some avocado-colored markers
[03:20] on it to green it up again, sure you fixed it,
[03:23] the avocado isn't brown anymore,
[03:25] but have you really solved the underlying problem?
[03:27] And I feel like when it comes to software,
[03:29] when you have a bad experience or bad outcome,
[03:31] we tend to jump to the conclusion
[03:33] that the solution is more software or better tooling.
[03:36] Whereas, maybe the bad software is just a symptom
[03:39] and we're skipping right over a root cause analysis,
[03:42] which leads us to just assume the solution
[03:44] is more software when we really don't need it.
[03:47] Go back to that brown avocado for a second
[03:49] and imagine you color it in, you make it green,
[03:51] you hand it to somebody,
[03:52] it's not gonna be particularly appealing, but like,
[03:55] if you try to solve that by just putting
[03:56] some guacamole seasoning on it,
[03:58] now it's just gonna be covered in little dust flakes.
[04:00] And, you might try to solve that then by putting some
[04:04] artificial avocado flavoring on it to kinda restore
[04:06] the original flavor.
[04:07] And at this point it's disgusting, right?
[04:09] Like it's obviously not the solution,
[04:12] but had this been a software conference,
[04:13] maybe people's reaction would be like,
[04:15] we need better avocado tooling
[04:17] instead of questioning the premise.
[04:19] So whenever we find ourselves treating the symptom
[04:21] in this industry, it's worth interrogating,
[04:23] whether we're just creating new problems
[04:25] and distracting ourselves from the root cause.
[04:29] I'm using the word trust a lot.
[04:30] And so it's probably worth clarifying what I mean by that.
[04:33] Sometimes people use trust to mean like hope
[04:35] or a commitment or a promise
[04:37] or some sort of reciprocal agreement to rely on one another.
[04:40] And really today I'm just talking about it
[04:42] as a default reaction, an orientation
[04:44] that we have to novel stimuli.
[04:47] So like,
[04:48] imagine trust as a compass
[04:49] where north is the direction of trust.
[04:52] When I'm personally operating from a place of trust,
[04:54] I tend to assume that other people are competent
[04:56] and that their ideas are valid
[04:58] and that new things are worth giving a try.
[05:00] But when I react out of distrust,
[05:02] I doubt the competence of others and,
[05:04] with it I react negatively to their ideas and to their work.
[05:08] And so just like a compass,
[05:09] that orientation determines where we're going.
[05:12] And I think that one of the reasons tools get conflated
[05:15] in here is because whatever tools we use
[05:17] are just gonna get us going
[05:19] where we're already headed, faster.
[05:21] So, as you go in a direction, like say with distrust,
[05:26] it tends to compound the further you get,
[05:28] sometimes in a vicious cycle,
[05:30] sometimes just in second-order effects,
[05:32] but the same is true for trust.
[05:33] And so the same exact tools
[05:35] can be used in a trustful orientation.
[05:37] And those vicious cycles could just as well be
[05:39] virtuous cycles as we build out and reinforce
[05:43] and have a larger positive impact.
[05:45] And so today I'm just gonna share a few stories
[05:47] from my own career about experiences
[05:49] with both trust and distrust.
[05:51] And let's kick things off by talking about
[05:53] the kernel of distrust,
[05:55] because it winds back to me as an individual
[05:59] at the beginning of my career.
[06:01] And so at the time I really doubted myself
[06:04] at my first programming job.
[06:05] I walked in and it was like Charlie Brown's teacher
[06:08] just blabbing 40 proper nouns
[06:10] I'd never heard before.
[06:11] And people spouting aggressive deadlines
[06:14] and talking about how failure wasn't an option.
[06:16] It was really intimidating.
[06:18] Now we were a Java team and we're using this IDE,
[06:21] or integrated development environment, called Eclipse.
[06:24] It looked something like this.
[06:25] And if you've ever used an IDE,
[06:26] a funny thing about them is that you have to dig
[06:29] to spot the very, very itty-bitty code there.
[06:31] That's where the actual code is,
[06:33] because most of the user interface chrome
[06:35] is actually tied up with all of these supporting tools.
[06:38] They help you navigate or edit the thing or refactor stuff
[06:42] in an automated way as opposed to spending
[06:44] as much time writing code.
[06:46] And they are indeed very Neato.
[06:47] In fact, only like a few weeks later
[06:49] because I was so self-conscious
[06:52] about how I didn't really know how to write code well,
[06:54] Eclipse became my security blanket.
[06:56] I was able to kind of grow up around the environment
[07:00] and find a way to be quasi-productive
[07:02] by just right-clicking on stuff
[07:04] and discovering new user interface elements
[07:06] that could do stuff for me.
[07:07] So I was coming from this place of feeling lost
[07:09] and overwhelmed and not confident.
[07:11] And the first thing that I found was the IDE
[07:14] could help me navigate around in the code.
[07:16] And so looking at a function here,
[07:17] if I wanted to know what this particular method did,
[07:20] I could just Control-click on it and then I'd be there.
[07:23] And I could see the code listing
[07:25] and start working from there.
[07:28] But what I found was,
[07:28] I wasn't building a real good mental model of the system.
[07:31] So if somebody asked me about a particular feature
[07:33] and what was responsible for it,
[07:34] I wouldn't really remember, and that was concerning.
[07:37] But as time went on, it'd been a couple of months now
[07:40] and I didn't know where anything was.
[07:41] And so I came to rely on another feature.
[07:43] It was auto-complete and fuzzy finders in the system.
[07:45] And so here,
[07:46] if I'm writing a function to validate addresses,
[07:48] I could start writing form
[07:50] and then hit my auto-complete button
[07:52] and then get a whole bunch of things that start with form.
[07:54] And then I pick, okay, well I wanna format address.
[07:56] And so I pull that method in.
[07:58] And so you'd see the top of every file,
[08:00] just start to import all of the stuff
[08:02] that you'd auto completed in.
[08:03] And they could just be methods from wherever,
[08:04] but sometimes you'd end up reaching and grabbing
[08:07] like really silly stuff like production code
[08:09] that pulls in a test helper method,
[08:11] just because it looks like a thing that might be useful.
[08:13] And that's wildly inappropriate,
[08:14] but like it makes sense
[08:16] because systems are kinda just graphs, right?
[08:19] Of objects invoking methods.
[08:21] And those invocations are the edges in the graph.
[08:23] And they're easiest to conceptualize when our code
[08:26] doesn't have any cyclic dependencies
[08:28] and it's just things calling other things.
[08:31] And so here I am,
[08:33] writing some code at the bottom of this tree.
[08:35] If I'm reaching around to a whole bunch of other units
[08:39] throughout the system,
[08:40] I'm probably creating a lot of cyclic dependencies
[08:43] and it could be having the effect of
[08:47] making the system harder for me to understand
[08:49] and harder for others to maintain.
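To make that concrete, here's a minimal Ruby sketch of the kind of coupling auto-complete made easy; the AddressValidator and FakeAddressFactory names are made up for illustration, not from the actual system:

```ruby
# test/support/fake_address_factory.rb — a helper meant only for tests
module FakeAddressFactory
  def self.format_address(address)
    address.to_s.strip.squeeze(" ")
  end
end

# lib/address_validator.rb — production code that auto-complete "helpfully"
# coupled to the test helper above, exactly the kind of dependency that makes
# a system harder to reason about and maintain
class AddressValidator
  def valid?(address)
    !FakeAddressFactory.format_address(address).empty?
  end
end
```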
[08:52] So that only compounded, right?
[08:54] So not only had I been relying on a lot of
[08:56] automated tools to make things,
[08:57] now I didn't have any confidence at all
[08:59] that I could just write code from scratch.
[09:01] And so I started relying on code generators
[09:03] and little wizards throughout.
[09:05] So here's what Eclipse looks like with a blank page.
[09:08] And of course, like blank page syndrome sets in,
[09:11] and I just don't know how to write Java from scratch.
[09:13] But fortunately it has little wizards.
[09:15] Like I could create a web service or I could just fill out
[09:17] this form and get a Java class.
[09:18] And so I started doing that.
[09:20] Now, if you're conscientious
[09:22] and you're designing a system in an organized way,
[09:26] you might have a file structure that looks like this.
[09:28] But if you're just using a wizard,
[09:30] it generates it all for you,
[09:31] and no automated tool can be a substitute
[09:33] for thoughtful design and organization.
[09:35] And you just end up with sort of a dumping ground of files
[09:38] in one place.
[09:39] And it makes it harder for you to reason about the code
[09:42] and for other people to understand what's going on.
[09:45] So I personally spent several years treading water
[09:47] like this.
[09:48] The tools helped me cope,
[09:49] but they really just enabled me to put off learning
[09:51] how anything really worked.
[09:53] Now, like this sort of thing can spread, right?
[09:56] So like what happens when distrust spreads
[09:58] throughout an entire team's culture?
[10:00] And I joined a distrustful team once,
[10:02] but even still, I had a really good first week.
[10:04] I was proud of what I'd done,
[10:05] but it was clear no one valued any of my progress
[10:07] because I didn't have any JIRA tickets filed.
[10:10] So what had happened here was this team didn't believe
[10:14] that programmers could be trusted to manage their time
[10:16] and their activities.
[10:17] And so they required everyone to write those
[10:19] into a ticket tracking system that they'd monitor.
[10:22] You'd have a new ticket, you'd write a summary
[10:24] with an estimated and actual hours,
[10:26] and then you'd have to assign a reviewer
[10:28] to review it for you.
[10:30] And so what blew me away was that
[10:31] all these incredibly bright colleagues
[10:33] who are experts in their domain,
[10:35] were just walking around all day, talking about how to game
[10:37] this ticket tracking system appropriately.
[10:39] And, I caught myself doing it too.
[10:41] Maybe I wanna create a new service,
[10:43] but instead it's more amorphous and ambiguous,
[10:46] like where I can define success for myself,
[10:48] if I just call it refactoring.
[10:49] And I could estimate that it's 12 hours,
[10:51] but I know I'm supposed to hit 40.
[10:52] So I could just like pad the estimate.
[10:54] And if I go over time, I know I'm gonna get in trouble.
[10:56] So I'll just kind of like hide time and reduce my actuals
[11:00] to make sure I'm meeting my estimates.
[11:02] And instead of picking the reviewer
[11:03] who knows that part of the system best,
[11:05] instead I'm just gonna like find an ally
[11:07] who will just approve it for me.
[11:09] And this of course didn't lead to anything good.
[11:11] And instead what happened was the code quality suffered.
[11:14] And instead of like going back on this decision,
[11:16] what the team did instead was start to mandate
[11:18] code quality metrics, in particular code coverage.
[11:21] So that's the percentage of lines
[11:22] that get executed whenever the tests run.
[11:24] And it wasn't high to begin with,
[11:25] but they had a CI tool that would go red whenever
[11:28] the code coverage dropped after a new commit,
[11:31] and boom, there it was, oh, it was me.
[11:34] I decreased the code coverage. And what happened?
[11:36] I had a really long method
[11:38] somebody else had written,
[11:39] I added a couple of lines, and because there were no tests
[11:41] of this method, the coverage dropped.
[11:43] So what was the solution?
[11:44] Well, you could just write an empty test
[11:46] and then call that method and pass nonsensical arguments
[11:49] and not write any assertions.
[11:50] And then boom,
[11:51] as long as it doesn't like raise an error,
[11:53] you got a lot of code coverage
[11:54] and you'd see the code coverage spike.
[11:56] And then everyone on the team would congratulate me
[11:57] on increasing the code coverage.
[11:59] Kinda silly.
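To picture it, here's a minimal sketch of what one of those assertion-free, coverage-inflating tests might look like, assuming Ruby and Minitest; the ReportGenerator class and its method are made up for illustration:

```ruby
require "minitest/autorun"

# Stand-in for the real production code: a long-ish method with no tests of its own
class ReportGenerator
  def generate_summary(user, line_items, label)
    total = line_items.sum { |item| item.to_f }
    label = label.to_s.strip
    user ? "#{label} for #{user}: #{total}" : "#{label}: #{total}"
  end
end

class ReportGeneratorTest < Minitest::Test
  # No assertions at all: this "test" exists only to execute lines so the
  # coverage number goes up. As long as nothing raises, it passes.
  def test_exercise_generate_summary_for_coverage
    ReportGenerator.new.generate_summary(nil, [], "whatever")
  end
end
```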
[12:01] When this happened, everyone was like,
[12:03] kind of, shell-shocked.
[12:04] They were afraid of writing code
[12:06] that wouldn't meet these metrics.
[12:07] And so productivity dropped off and the team's solution was
[12:09] to start benchmarking individual velocity in terms of
[12:12] how many lines of code people were writing per month.
[12:15] So some people wrote a little, some wrote a lot.
[12:17] I actually sometimes deleted more code than I wrote,
[12:19] and that got me in trouble again.
[12:22] So I'd talk about like, well,
[12:23] is it my JavaScript quality metrics?
[12:25] Are they not good enough?
[12:25] And they say, oh no,
[12:26] we actually don't even track those
[12:28] 'cause we mostly do Ruby.
[12:29] And I was like, well,
[12:30] does my JavaScript still count as progress?
[12:31] And they're like, oh yeah,
[12:32] it still counts towards your individual velocity.
[12:34] So we took this application, it was mostly Ruby,
[12:37] but then we gave developers this pathway,
[12:39] a path of least resistance, by which
[12:40] they could just write as much JavaScript as they wanted
[12:42] without any judgment.
[12:44] So suddenly it became a very JavaScript heavy application.
[12:47] It was a totally unintended consequence.
[12:49] And so I wrote a lot of code and I got my star
[12:51] and I moved on.
[12:53] But when my colleagues didn't trust me to do great work,
[12:57] it really felt impossible for me to do great work.
[12:59] And it became a self-fulfilling prophecy. Attempts to control
[13:02] how we created things
[13:04] just distracted from the joy of creating,
[13:06] and it came to feel like just another job.
[13:09] How does this spread further?
[13:11] Right?
[13:12] Like what's the impact when distrust as a culture starts
[13:14] to take root in an organization?
[13:16] And so one time I had a client where we were in this really
[13:19] obvious case of just out of touch management.
[13:21] They didn't understand software.
[13:22] They didn't want to.
[13:23] And what they did know is that they wanted
[13:25] to move the process to lean because they heard
[13:28] that it would reduce waste.
[13:29] And so I was game, okay, show me what you've done so far.
[13:32] And their premise or thesis was that they were spending
[13:35] too much on programmers, and their solution:
[13:37] pay less for programmers.
[13:38] They looked at a board of years' experience
[13:40] and hourly rate.
[13:41] And they said, Muda!
[13:42] That's borrowing a Japanese word for waste
[13:44] for the sake of calling it lean.
[13:45] And they added up the numbers, right?
[13:48] So 85 times seven people over 40 hours,
[13:50] that's a pretty high run rate
[13:51] and they weren't happy about that.
[13:52] And so what they did was they started laying people off
[13:55] and hiring people with less experience
[13:57] and for lower dollars per hour.
[13:59] And they succeeded in a sense, they got their like,
[14:02] blended rate down in terms of their loaded costs.
[14:05] And the seven people were slightly less happy.
[14:07] What they didn't account for, though,
[14:08] of course, was that people with less experience
[14:10] and no institutional knowledge took twice as long
[14:12] to get anything done.
[14:13] And now their overall cost to get anything accomplished
[14:16] in the system went up.
[14:18] And so that didn't solve the problem.
[14:20] But instead of taking that as feedback,
[14:21] that that was a bad strategy, they just leaned in and said,
[14:24] well, the problem here is the programmers
[14:25] aren't spending enough time at their laptops coding.
[14:28] And so we need to manage them more strictly.
[14:30] And what I saw when I walked into these rooms, it was like,
[14:32] this is a really collaborative environment.
[14:34] People are getting along great,
[14:35] but what they saw was different.
[14:37] They wanted to improve it, right?
[14:39] So they borrow another Japanese word and they say,
[14:41] Hey, we need product to be inclusive
[14:45] and included in all of these design decisions.
[14:47] And so they just kind of dropped a plant in the middle
[14:49] of the room and that person sort of shut down
[14:51] all these conversations,
[14:52] which had a silencing effect on everybody.
[14:54] And they indeed got what they wanted.
[14:56] Everyone was working at their desk all day,
[14:58] even though they were very disengaged.
[15:00] And then, as you might expect,
[15:04] office morale kind of dropped off a cliff
[15:06] and no one was having fun, right?
[15:08] And so they decided to just move the developers
[15:10] to a new office, they're looking around and they're like,
[15:13] well, us business guys all have a lot of fun.
[15:15] And so they just said, Mura!
[15:17] Unevenness, like we should have concentrated offices
[15:20] based on everyone's function.
[15:21] And they moved those people out,
[15:23] literally down the street
[15:24] to like a kind of dilapidated warehouse facility.
[15:27] And it was like not fun,
[15:28] but to them it was just out of sight out of mind.
[15:32] It can be hard to even imagine how an organization can get
[15:35] this screwed up,
[15:36] but like we can of course relate to ourselves,
[15:40] not believing in our own abilities
[15:42] or in the abilities of our teammates.
[15:45] And I think what we often fail to do is make the connection
[15:47] that by the time distrust really takes root
[15:50] in an overall organization,
[15:51] there's just enough distance and indirection
[15:54] to systematize that distrust.
[15:56] And it can literally dehumanize people by just assuming
[15:59] the worst in them and codifying that into practices
[16:04] and policies.
[16:05] So that's enough about distrust.
[16:07] Let's tell a few stories about the difference in life
[16:09] when you are in a trusting scenario.
[16:12] So first I wanna talk about just the first kernel,
[16:15] how trust benefits you as an individual.
[16:18] I had a client once and they were very, very smart.
[16:20] And so they took their system and broke it down for us.
[16:22] They said, well, you know, it's a large single tenant app.
[16:24] It participates in energy markets,
[16:25] controls millions of thermostats via a radio paging network.
[16:30] And it helps keep the grid stable when it's under peak load.
[16:33] And that was amazing.
[16:36] And then there was more: so your job is to expand that app
[16:39] from merely participating in markets
[16:41] to instead establishing and running the market
[16:43] for this redacted nation.
[16:45] And at this point I just didn't know what to do.
[16:47] But I had enough experience at that point to know
[16:49] that even though this was more work than I could handle,
[16:52] all I gotta do is shrink the problem
[16:54] and narrow my focus to something that I can control,
[16:57] so I don't get overwhelmed
[16:58] and so I can wrap my head around it.
[17:00] So there was this big existing Rails application,
[17:02] and I was content to just say, that's a lot,
[17:05] but I'm just gonna set that aside
[17:07] and instead think about just this thin candy shell
[17:09] of new functionality that I'm gonna build
[17:11] that just takes in electricity and dollars
[17:12] and does the math that they're asking me to do.
[17:15] Now, to get started though,
[17:17] I had to get some kind of traction.
[17:19] What was the first line of code that I'd write?
[17:21] And the answer to that is I wanted to create
[17:22] the simplest possible thing I could.
[17:24] I was looking to just get some kind of toehold,
[17:27] writing something, even if it's trivial, that I could use
[17:30] to make incremental progress.
[17:32] And so one way to do this
[17:33] might've been to introduce a browser
[17:34] and start doing web UI stuff
[17:36] and maybe render a calculate button, hit a button,
[17:38] and then like see it go through the whole system
[17:41] and arrive at some outcome.
[17:43] That would have worked, I think,
[17:44] but it still would have dredged up
[17:46] all of this complexity
[17:47] I wasn't ready to digest quite yet.
[17:50] And it would have added all of this UI
[17:53] and browser complexity as well;
[17:55] it would just be overwhelming.
[17:56] So instead what I did,
[17:57] was I just pushed away the larger application
[18:00] for a second and treated this new functionality
[18:02] as if it was a self-contained library.
[18:04] And I started writing a failing test
[18:06] and then making that test pass.
[18:08] What we were doing was practicing test-driven development.
[18:10] And TDD as a practice over my career really helped me
[18:14] break down problems when I was at my lowest confidence
[18:17] and find a productive rhythm,
[18:18] even when I'm starting something completely from scratch.
[18:21] So here you just start with a basic test case,
[18:24] just instantiate a new thing, call it with a basic argument.
[18:28] And then, if you call it with zero, assert
[18:30] that you get zero back, something real simple.
[18:32] And the name of the game really
[18:34] is with each action you take when you have a failing test,
[18:36] either make that test pass or change the message.
[18:40] Now the first message you get is that
[18:41] that class doesn't exist.
[18:42] And so you just go in and define the class
[18:44] and then it'll say, oh, well that method doesn't exist
[18:46] and you define the method.
[18:47] It'll say, oh, well, that doesn't have this argument,
[18:49] and you just add a parameter.
[18:51] And finally you get the actual assertion error saying, Hey,
[18:53] I expected this to be zero, but it was nil.
[18:56] And like, these are very, very tiny steps,
[18:58] but then you make it pass by adding a zero
[19:00] and it really starts to give you this sense of
[19:04] forward momentum as you go.
[19:07] And boom, there you go.
[19:08] We got to our passing test, a dot means passing
[19:10] in this test runner.
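Here's a minimal sketch of what that first red-green cycle might look like in Ruby with Minitest; the Calculator name and the zero-in, zero-out behavior are illustrative stand-ins for the actual market math:

```ruby
require "minitest/autorun"

# The simplest possible starting point: zero electricity in, zero dollars out
class CalculatorTest < Minitest::Test
  def test_zero_electricity_costs_zero
    calculator = Calculator.new
    assert_equal 0, calculator.price(0)
  end
end

# Each failure message tells you the next tiny step: define the class,
# then the method, then the parameter, then finally return a value.
class Calculator
  def price(kilowatt_hours)
    0 # hard-coded to make the first test pass; later tests force real math
  end
end
```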
[19:13] And so over time, right?
[19:14] I just kept adding more dots.
[19:15] And each of those dots represented a feature or an edge case
[19:19] or an if-then,
[19:20] and what it was able to do was give me the confidence
[19:23] that I had built something of value,
[19:25] 'cause this was all to the specifications
[19:26] that they'd provided me.
[19:27] And so once I had that traction,
[19:30] I just had the confidence now to prove that it worked
[19:33] by plugging it into the bigger system finally.
[19:37] So I had this test running
[19:39] in just this isolated case.
[19:40] And now I was gonna pull in the bigger app
[19:42] and write a failing test that just called through my thing
[19:45] and everyone else's stuff
[19:46] and ultimately get some kind of answer.
[19:48] And so to do that,
[19:49] I just wrote an empty test case
[19:50] that just did the bare minimum of test setup
[19:52] so that it wouldn't blow up when I was done.
[19:55] So I ran the test and a lot of time passed.
[19:58] And it did pass, it didn't blow up.
[20:00] But it still didn't look right
[20:01] because it took like four minutes.
[20:03] And I've learned over the years that it is really important
[20:06] to get fast feedback from my system, right?
[20:09] Thinking about this, there's only 480 minutes in a workday.
[20:13] And if it takes me four minutes to run a single test
[20:16] that's empty, that doesn't do anything else.
[20:18] Then I can only run it 120 times at maximum.
[20:21] And that doesn't count thinking
[20:22] and that doesn't count typing.
[20:23] So like it really limits the number of feedback loops
[20:26] that I can have every day or in other words,
[20:29] like the number of questions
[20:30] that I can ask my system every day.
[20:32] Now, the more you can ask of your computer
[20:35] and the faster, and the richer feedback you get back,
[20:37] the more you're able to build your confidence.
[20:39] So this is super important.
[20:41] So anyway, I had to make it faster.
[20:42] So I pulled out a profiler and finally,
[20:45] as I'm looking at how this code is getting invoked,
[20:48] I have the confidence to pull back the covers a little bit
[20:50] and start to dig into the individual points of slowness
[20:54] throughout the system and optimize them and make them faster
[20:56] until I get everything not only passing
[20:58] but much more performant.
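If you've never profiled a slow suite, even Ruby's built-in Benchmark library is enough to start pulling back those covers; here's a rough sketch of timing the pieces of an expensive test setup, where the setup steps are hypothetical stand-ins:

```ruby
require "benchmark"

# Hypothetical stand-ins for whatever an expensive test setup actually does
def load_reference_data;   sleep(0.2); end # e.g., seeding lookup tables
def connect_to_services;   sleep(0.1); end # e.g., building API clients
def build_market_fixtures; sleep(0.3); end # e.g., constructing fixture objects

# Time each chunk of setup to see where the minutes are actually going
Benchmark.bm(25) do |x|
  x.report("load_reference_data")   { load_reference_data }
  x.report("connect_to_services")   { connect_to_services }
  x.report("build_market_fixtures") { build_market_fixtures }
end
```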
[21:00] And this was a way to take the confidence I've built
[21:02] in myself and earn trust with the rest of the team.
[21:07] So it had a compounding effect.
[21:08] Now, again, trusting yourself in this industry is not easy.
[21:12] The two things that helped me here were like,
[21:13] breaking down problems into smaller digestible bits
[21:16] and adjusting the environment around me to get richer,
[21:19] faster feedback from the computer.
[21:20] What works for you might be totally different.
[21:23] But taking it a step further and talking about
[21:25] how trust extends,
[21:27] I'd like to talk about how it benefits teams.
[21:29] Now I was on one team about 10 years ago now,
[21:31] that was the best team I've ever been on.
[21:33] It was called Course Reader
[21:34] and none of us had ever worked together before.
[21:36] And so the first thing we had to do
[21:38] is normalize on an approach.
[21:39] And the first step of that is just to form as a team.
[21:41] So we're in this cubicle farm and we decided to tear all
[21:44] those walls down and instead come together
[21:46] and form a team room around us.
[21:49] Once we had, we stormed,
[21:51] we like talked about all the different tools
[21:53] that we would like to use and different approaches
[21:55] and architectures.
[21:56] And it went on for like a week and a half or two weeks.
[21:59] And the business came along and they're like,
[22:00] these people are not programming, that's a problem,
[22:02] but we said, don't worry,
[22:03] it's gonna work out.
[22:04] And sure enough, over not much longer,
[22:07] we started to compromise on the tools that we would use.
[22:10] And we all kind of came to be of one mind
[22:12] about our approach.
[22:13] And in order that we'd be productive,
[22:15] it was important for us to kind of codify
[22:16] these norms upfront and commit to them.
[22:18] And so we said, all right,
[22:19] so let's take the conventions that we can agree to
[22:21] and get started.
[22:22] And so we literally printed out this covenant on
[22:24] like a plotter.
[22:25] We said, okay, we've agreed,
[22:27] we're gonna do a hundred percent test driven development.
[22:29] We're gonna pair a program all day every day.
[22:31] The second pair, when a first pair finishes a feature
[22:34] is going to validate that the feature works.
[22:36] And if there's ever an exception,
[22:37] let's come together as a team and just talk about it
[22:39] and find consensus.
[22:40] It's no problem.
[22:42] We all literally signed this, and by signing it
[22:44] and being bought in, it actually worked.
[22:47] And from there very quickly,
[22:50] we were able to really start to get traction
[22:51] and perform as a combined unit.
[22:54] So even though we'd normalized,
[22:56] we still had wildly different skills and experiences.
[22:59] And the next kind of secret weapon here
[23:00] was to pair program intentionally together
[23:03] in order to share that experience.
[23:06] So, you might call it prolific pairing
[23:08] where we have all of us in a room
[23:10] and we just randomly pair up at the beginning of the day,
[23:12] we work together.
[23:13] And then when somebody finishes a story or a feature,
[23:16] we switch it up and we switch up at random.
[23:18] So everyone has the experience of working
[23:19] with somebody else.
[23:21] We also did a kind of playfully adversarial game called
[23:24] ping pong pairing, where one person would start
[23:27] by writing a failing test, ping-ponging it to their pair.
[23:30] And then the other person would make that test pass
[23:32] while writing the next test, as if to throw down a challenge
[23:35] for their pair, ping-ponging back,
[23:37] who would then make that pass and write a third test
[23:39] and so on and so forth throughout the day,
[23:41] which is a great way to make sure that both parties get
[23:43] a chance to drive equally as well as to make sure
[23:46] that everything that we're doing is being demanded
[23:48] by a test.
[23:49] And we would do this until it got to feel pointless,
[23:51] where you're in a room together,
[23:53] and like you're pairing with somebody
[23:54] and your default reaction to every single question
[23:57] is the exact same.
[23:58] And once we got to that point, then we'd split up
[24:00] and maybe work on separate stuff.
[24:02] And so you'd see the team start to like kind of
[24:04] coalesce naturally as things got higher risk
[24:06] or higher uncertainty.
[24:08] So for us, that was what pair programming really was.
[24:12] It was a mind-meld and it was really awesome.
[24:14] It was exhausting, but we got a lot done.
[24:16] Whereas I think the popular notion of pair programming
[24:18] is much more like colocated troubleshooting,
[24:21] where you get blocked by something,
[24:22] you can get somebody else's help for 10 minutes
[24:24] or you screen-share and kind of just watch.
[24:26] If you can ever try the more engaged,
[24:30] like, ping pong pairing, I encourage you to.
[24:33] So we took that, now we're totally on the same page,
[24:36] we're performing,
[24:37] but we still don't have much insight into product decisions.
[24:39] And at this point we decided to take all that
[24:41] positive energy and pull the business into the team
[24:43] and help them succeed too.
[24:45] So up until that point,
[24:46] they'd just been throwing requirements at us over the fence.
[24:49] So we just grew the size of the team room and we invited in
[24:51] the product owner and the business analyst.
[24:54] And first we started with the product owner
[24:55] to help her specify user features
[24:58] that are written well, in a way that we can understand,
[25:00] as well as acceptance criteria for what it means,
[25:02] from her perspective, for them to be done.
[25:05] And we sat with the business analyst and said, okay,
[25:07] so given all of these acceptance criteria,
[25:08] let's write step-by-step instructions to understand
[25:13] whether this thing is actually working.
[25:15] And so we started out by writing these
[25:17] as like automated browser tests
[25:19] to make sure that the acceptance criteria were all
[25:22] fulfilled, but eventually we stopped because like 90 percent
[25:24] of the value was actually just forcing us
[25:27] to have these highly specific, intentional conversations
[25:30] to determine the product's direction.
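For a sense of what those early acceptance tests looked like, here's a minimal sketch assuming a Ruby stack with Capybara and Minitest; the tiny Rack app and the acceptance criterion are stand-ins, not the real product:

```ruby
require "minitest/autorun"
require "capybara/minitest"

# A throwaway Rack app standing in for the real product under test
Capybara.app = lambda do |_env|
  [200, { "content-type" => "text/html" },
   ["<h1>Market Summary</h1><p>Total: $0.00</p>"]]
end

class MarketSummaryAcceptanceTest < Minitest::Test
  include Capybara::DSL
  include Capybara::Minitest::Assertions

  # One acceptance criterion, written as step-by-step browser instructions
  def test_operator_sees_the_market_summary
    visit "/summary"
    assert_text "Market Summary"
    assert_text "Total: $0.00"
  end

  def teardown
    Capybara.reset_sessions!
  end
end
```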
[25:32] And from there going through those four stages,
[25:36] that team was just firing on all cylinders.
[25:38] In fact, when I reflect on it now,
[25:41] even though it took us a little while to normalize,
[25:42] once we did,
[25:43] we truly just operated as one unit and it required
[25:46] each of us to kind of come at it with the humility
[25:48] that we each have things to learn from one another,
[25:51] as well as just the genuine
[25:52] and thorough enthusiasm to build something as a team.
[25:56] So if you take that positive energy and you push it further,
[25:59] where does it get you?
[26:00] And I'd like to talk finally
[26:01] about just how trust benefits organizations.
[26:03] Now, if you're watching,
[26:05] this next part is gonna feel a lot like sponsored content,
[26:08] because I'm gonna talk about my company Test Double.
[26:10] And the reason is, we really founded it as an extension
[26:13] on this idea of high trust teams being more productive.
[26:18] The first thing that we started with was saying, well,
[26:20] controlling people obviously doesn't work.
[26:22] And so we need to honor one another's autonomy.
[26:25] When we first started,
[26:26] a lot of people were pushing us to kind of define
[26:28] the Test Double way of like, how do you write code?
[26:30] And I pushed back.
[26:31] I was like, you know what?
[26:32] I would never wanna work for me.
[26:35] So why would I define a whole bunch of rules
[26:37] that would just tell other people how to do their jobs?
[26:39] It didn't seem right.
[26:41] And we adopted the mantra early on: trust the people closest
[26:44] to the work to do the best job.
[26:46] And it's really paid off.
[26:48] Additionally, we were remote from day one,
[26:51] but in the last couple of years,
[26:52] a lot of companies had to go remote
[26:54] and they've been reluctant about it.
[26:55] And so as tools like Slack have become popular,
[26:58] the green bubble to indicate if you're online or offline
[27:01] has sort of been treated in a lot of places
[27:04] like a punch clock or like a time and attendance system.
[27:06] And so if you go offline in the middle of the day,
[27:08] it sets off a red flag somewhere,
[27:09] and they assume that you're just not working.
[27:11] We were totally different because for us being remote,
[27:14] it was actually a way to promote autonomy among people.
[27:17] They could control their schedule
[27:19] and figure out how they work best.
[27:22] Todd, our CEO, was fond of saying,
[27:23] if your client's delighted with your work,
[27:25] we don't care when, where, or how you do it,
[27:28] as long as they're happy and you're happy, we're happy.
[27:31] And so we started there, right?
[27:32] We had a lot of people in the room
[27:34] who were highly competent,
[27:36] loved the like high trust environment and what we needed
[27:38] to be successful as a broader organization though,
[27:40] was to form strong alignment.
[27:42] And the way that we did that was through open
[27:44] and honest communication.
[27:45] And so the first thing to balance
[27:47] is the signal-to-noise ratio, because some developers,
[27:49] they just wanna code, right?
[27:50] They want a job that lets them ignore
[27:53] how their company makes money
[27:54] so that they can go heads down,
[27:57] but they have to acknowledge that that exposes you
[27:58] to the risk that maybe your supposedly secure job
[28:02] is not financially sound
[28:03] and they have to lay everyone off.
[28:05] For most people that would be not enough signal.
[28:09] Some people think that they want radical transparency
[28:11] from work, but every time that the company
[28:14] would get bogged down
[28:15] and every expense that somebody charged
[28:18] or some other kind of trivial thing,
[28:20] there are more important discussions
[28:22] that should be getting heard, but aren't.
[28:25] And so that might be an indication
[28:27] that there's just too much noise.
[28:29] So we're very intentional about like balancing this
[28:31] by sending periodic contextual,
[28:33] often actionable updates to the staff.
[28:36] Todd monthly sends revenue and profitability,
[28:39] putting it in context,
[28:40] Mike shares our staffing forecast.
[28:44] Christine has an awesome EDI newsletter,
[28:46] Anya with recruiting, Kathy with marketing,
[28:49] and then Mary with all of the new people joining
[28:51] the company, benefits changes,
[28:52] and leadership opportunities.
[28:55] And it's not enough just to kind of send
[28:57] the right communication,
[28:58] but it's important that they be consistent
[29:00] because I don't know if you've ever had this experience,
[29:01] but even if I don't have reason for distrust,
[29:04] if I only hear from my boss at random times,
[29:07] it's natural to worry
[29:08] it's gonna be bad news.
[29:09] And that's why it's so important that,
[29:12] and I'm so grateful that Todd for almost 10 years now,
[29:14] every single month has sent our financials using
[29:17] the same charts and the same system
[29:19] to contextualize our current performance against history
[29:22] so that everyone has the context to respond appropriately.
[29:26] And this last part about communication
[29:28] should go without saying,
[29:30] but the best way to build trust when you're communicating
[29:33] is to stick to the truth.
[29:35] And before we started Test Double,
[29:38] I never realized how much of my brain was wasted
[29:41] on keeping track of and playing office politics
[29:44] and compartmentalizing information.
[29:46] It frees up so much of my headspace
[29:48] that I can just keep one story straight
[29:49] for pretty much all audiences.
[29:51] And to illustrate,
[29:52] I'm gonna show you the slides now
[29:53] for our two most recent internal meetings.
[29:57] First, the board meeting last month,
[30:00] where we had our Q2 results.
[30:01] And then we had an all-hands meeting the same week.
[30:03] And it was actually basically the same slides
[30:06] because there's nothing to hide.
[30:07] The company's results are the company's results,
[30:09] and we want strong, true alignment.
[30:12] So having that open communication,
[30:15] having the people in the room who are themselves providing
[30:17] a ton of value, it became clear eventually
[30:20] that success wasn't being shared equitably.
[30:21] We just had two co-founders.
[30:23] And so we decided to transfer ownership to the employees.
[30:26] And when most companies give you shares or stock or equity,
[30:30] they're doing it kind of for a shorter sighted,
[30:33] like objective or a side effect.
[30:35] Like if you go to a bunch of startups,
[30:37] they might all be long shots,
[30:38] but you can imagine that most of the time
[30:40] when you're offered equity,
[30:41] it's either to recruit you in the first place,
[30:43] retain you once you're there or potentially
[30:45] to like push you to work lots of overtime
[30:48] 'cause you've got skin in the game.
[30:49] And frankly, our story was just a little different, right?
[30:52] Like we trusted people with autonomy
[30:54] 'cause we thought it was right.
[30:54] And we saw success.
[30:56] We achieved strong alignment because we were communicating
[30:59] and we grew,
[31:00] but like eventually we kept on growing
[31:02] and people would be right to ask,
[31:03] can they get a piece of this?
[31:05] And it just started to feel like the logical extension
[31:08] for our high trust culture was to share that success
[31:11] with everyone more fairly.
[31:13] So, we went big last year and Todd announced
[31:15] that we were going to transfer a hundred percent
[31:18] of the ownership of the company to the employees.
[31:21] And now that's well underway because we chose
[31:23] a well-regulated employee stock ownership plan.
[31:26] And it's really transformed the way that we all look
[31:30] at the company.
[31:31] Because it's not like that long shot startup.
[31:33] This is already a successful, proven stable company.
[31:37] And it grows incrementally and sustainably every year.
[31:42] But again, like it didn't originate from some incentive
[31:45] or compensation scheme that we had in mind.
[31:47] We just were following this same journey of trust.
[31:50] We trusted people to work autonomously.
[31:52] We trusted them with the truth and it only made sense for us
[31:56] to all share in the same success.
[31:58] And what I found over the years is like
[32:00] the further you travel down this path of trust,
[32:03] the less able I've been to predict
[32:05] where it was gonna take me,
[32:07] but so far the results have done nothing but surprise
[32:10] and delight me at every turn.
[32:12] At every stage of the company over the last 10 years,
[32:14] I am just incredibly excited about all of the people
[32:19] I've gotten to meet and the strong relationships
[32:20] we've been able to form.
[32:22] And I wouldn't trade it for anything.
[32:25] And so there's some number of people, though,
[32:27] who've listened to all of this and they're like waiting
[32:29] for the how, like, how do I trust, though?
[32:33] And the answer is gonna be different for everyone.
[32:35] There's all sorts of different situations,
[32:37] all kinds of different people.
[32:38] All I can do is offer a starting point
[32:40] as a suggestion to try.
[32:42] And that would be to say next time
[32:43] that you have a visceral distrustful reaction,
[32:45] for good reason or not, try to pause
[32:48] and see if there isn't a way to reframe it.
[32:50] Maybe it's giving yourself the benefit of the doubt
[32:53] or assuming that your colleagues are competent
[32:55] and acting in good faith.
[32:57] And if you're in a position to manage people,
[32:59] to just believe that they all want to do their best work
[33:02] and find a way to support them,
[33:03] instead of just actively managing them.
[33:05] So thank you.
[33:07] If you're interested in learning more about Test Double,
[33:10] I hope you'll check us out.
[33:11] We're hiring thoughtful people who wanna improve
[33:14] how the world writes software.
[33:16] And we're always looking for clients
[33:17] who are just trying to accomplish ambitious things
[33:20] and they're interested in improving
[33:21] and they need some extra help.
[33:23] And of course, I hope you'll reach out to me directly
[33:25] if you have any feedback or even if you just wanna chat.
[33:29] Again, I'm super thankful for you taking the time
[33:31] with me here today, talking a little bit about trust.
[33:34] And I hope that you're able to take a little bit of this
[33:37] and put it to good use.