
Dan Lorenc was at a restaurant in downtown Austin with friends when he received an unusual text. It was the summer of 2021, and the message was from his former coworker, Matt Moore. Lorenc was a software engineer at Google, and he and Moore had worked together on its cybersecurity team. The past year had been especially challenging, as the security industry as a whole had faced a reckoning.

In December 2020, SolarWinds reported that its network monitoring platform had been hacked by operatives suspected of working for Russian intelligence. The attackers had planted malware in SolarWinds’ code, giving them access to the data of thousands of SolarWinds’ clients, from Microsoft to the U.S. government. The hack exposed a critical flaw in the security of software supply chains. Just as in real-world manufacturing, software is assembled from components created by developers all over the world. At some point during SolarWinds’ development process, malicious actors had inserted malware into two software updates, which customers then installed. The breach was a wake-up call: without verifying the integrity of software at every stage of its lifespan, this type of hack could and would happen again.

Moore had left Google earlier in the year, and Lorenc had called him every month since, hoping to convince him to return. Moore was one of the finest software engineers Lorenc knew, and if anyone could find a solution to supply chain security, it was him. That night, Moore’s text to Lorenc said he was ready to get back to work—but not at Google. “Let’s start a new company,” Moore texted. Lorenc didn’t need to think twice. He was in. 

The next day, Lorenc put in his notice at Google, bought a new laptop and began work on Chainguard, a company focused on software supply chain security. Although technically the company was born that night in Austin, Lorenc had been preparing for this moment for years.


The late 2000s and early 2010s ushered in a fundamental shift in the way tech companies approached security, as alarm around new types of digital threats grew. In 2009, in what became known as Operation Aurora, Chinese hackers targeted the source code of companies including Google and Adobe, attempting to access and alter it. In the Snowden leaks of 2013, an NSA contractor copied and disclosed a trove of classified documents.

Lorenc joined Google in 2012, working on the company’s efforts to secure its own code and protect its databases from external or internal attackers. In 2016, after he and his team had implemented Google’s new, refined security system, Lorenc shifted focus to Minikube, an open-source tool for running Kubernetes clusters locally. This was Lorenc’s first time working predominantly with open-source code—code that’s freely available for anyone to download, modify and exchange. As Lorenc was building and shipping Minikube to users all over the world, he realized he could easily have slipped malware into the software. The package’s label would read “Minikube,” but few would take the time to comb through the thousands of lines of code in every directory of the program to determine exactly what was inside. Proper verification would take an engineer days, weeks or even months—bandwidth they almost certainly didn’t have, and therefore a task that would fall by the wayside.

Lorenc was terrified. Wasn’t it just a matter of time before someone packaged up open-source software that secretly contained code that could put an entire company and its community at risk? “I realized how much damage I myself could do, and how easily,” Lorenc says. “And if I could do that, so could anyone.”

The concern over how to protect software from hidden malware isn’t a new one. In 1984, the computer scientist Ken Thompson wrote an article called “Reflections on Trusting Trust” that demonstrated how easily a Trojan horse could be hidden inside a computer program. Thompson’s point was that you could never fully trust any code you hadn’t written yourself. According to Lorenc, 90-98% of the code used today is open source. Most companies focus their resources on making sure the small amount of code they write themselves is secure, but that is just the tip of the iceberg. “The 2% of proprietary code is the only part you see over the water, but beneath that is the 98% of code you’re not responsible for yourself,” Lorenc says. “In some ways, open-source code is amazing. It’s enabled all this innovation and velocity and reuse. But it’s also scary because that means 98% of your code is written by strangers on the internet. Which is great when people are well-intentioned, but not everyone on the internet is nice.”

Despite evidence of the risks open-source code can pose to a company’s security, Lorenc learned that managing these risks wasn’t something widely prioritized. When he spoke with engineers at other software companies, supply chain security wasn’t a top concern. “Being a software engineer is like playing Whac-A-Mole. There are so many issues you have to deal with on a constant basis that you can’t spend time on preventive measures,” Lorenc says. “You only pay attention to a problem when it rears its head. At this point, there hadn’t been large instances of open-source vulnerability being a problem, so no one seemed to think much about it.”

No one but Lorenc. That is, until December 2020.

Lorenc with fierce Chainguard mascot, Charlie

Lorenc was at home when he received a news alert about the SolarWinds attack. Thousands of companies were reporting their data, networks and systems compromised—exactly what Lorenc had been worried about. He stood and walked to his car. His house in Austin was close to SolarWinds’ headquarters, so he drove there and took a picture of himself in front of its sign. It would serve as a reminder of how important it was that he keep raising a red flag about the software supply chain. But it was also a moment of vindication: he had seen this coming. Maybe now people would take the problem seriously.

Although Google was not affected by the breach, Lorenc and his coworkers were intimately familiar with the type of attack SolarWinds had fallen victim to, because they had spent years building internal systems at Google to prevent it. Lorenc and his team would end up working closely with engineers from SolarWinds to help them do the same. (The following year, when Chainguard held a conference to announce its launch, a SolarWinds engineer spoke about how instrumental Lorenc’s team had been in designing its new security system.)

But the question of how to entirely prevent these attacks remained: When programs are built on hundreds of thousands of lines of open-source code, how do you ensure that every line is as advertised and secure? It’s a problem so large in scale that it feels almost impossible to figure out. Luckily, that’s kind of Lorenc’s thing.


Lorenc’s love of solving difficult problems began when he was a kid, fixing up damaged stock cars with his grandfather. They’d spend hours after school and on weekends tinkering together in a small garage. Once the cars were up and running, Lorenc’s grandfather (and after he got his license, Lorenc himself) would race them in community events. When the cars would break down, they would take inventory of the problems and fix them up again. This was the part Lorenc loved most—looking at the engine as a system of parts, understanding how each one functioned and then using that knowledge to fix what wasn’t working. 

In college, Lorenc pursued mechanical engineering. During his junior year he took a programming class and was hooked. “I was building stuff, but it was a lot faster,” Lorenc says. “You didn’t have to order parts and wait a week for them to ship. You could build those components yourself with a laptop, anywhere.” Software engineering offered the same thrill as fixing up old cars. “It’s that loop of trying to solve a problem, and then either seeing it work or seeing it fail miserably and getting to debug it live. You keep going and get real-time feedback until you solve it. That’s why I liked the computer world. I could do more of that in a lot less time.”

One of the first pieces of code he wrote was a program to solve Rubik’s cubes. The program didn’t work—it kept running without ever arriving at a solution. One of his peers laughed when he saw the code and showed Lorenc how to fix it. It was an important moment. “That’s when I realized, with programming, there’s always a solution,” Lorenc says. “It might not be obvious at first, but it’s there. You just have to find it.”

Lorenc began working at Microsoft soon after he graduated from college. He remembers the environment as rewarding, but also stressful. One weekend, he and his colleagues were working overtime ahead of an upcoming release, and during a particularly tense moment, his manager pantomimed a Monty Python sketch. Lorenc and his team broke into fits of laughter, and the mood instantly brightened. The moment left an impression on Lorenc: a good work environment balances hard work with humor.

Chainguard founders Kim Lewandowski, Ville Aikas, Dan Lorenc and Matt Moore.

“You’re doing what?” 

Bogomil Balkansky, Chainguard’s partner at Sequoia, sat across the table from Lorenc and Moore, not sure he was hearing them right. 

In the months since Moore’s initial text to Lorenc at the restaurant in Austin, Chainguard had officially launched, and its founders had begun building its product and seeking out investors. Lorenc says starting a company at the beginning of 2022 wasn’t easy. Inflation was rising, and amid a wave of layoffs at tech companies, Lorenc would pitch Chainguard to software engineers—potential customers—one day, only to learn the following week that they had been laid off. After months of meetings, the founders finally gained traction. The need for supply chain security tools was too great, and companies realized they couldn’t handle the job alone.

Lorenc and the Chainguard team first met with Sequoia in the spring of 2022. Balkansky was impressed by Chainguard’s initial product in development, Chainguard Enforce, a program that scans containers and produces an itemized list of the code inside. Engineers can then review the list for malware and weaknesses that might make the containers vulnerable to hackers. And so Balkansky was surprised to learn, just days after Sequoia’s partnership with Chainguard was finalized, that, according to Lorenc, the team was already in the midst of building out a far more ambitious product: Chainguard Images.

In computing, a “container image” is a static, self-contained bundle of the code and dependencies an application needs to run. Chainguard Images would be a library of these images, built from open-source code that Chainguard engineers would verify, sign, patch and secure. With Chainguard Images, customers would no longer have to pull open-source code they’d found on the internet and hope for the best. Instead, they could select an image from Chainguard’s library and feel confident that the code inside was exactly as advertised and hadn’t been tampered with by bad actors.
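Both products rest on the same underlying idea: knowing exactly what is inside a piece of software, in a form that can be reviewed and verified. As a rough illustration of that idea (this is not Chainguard’s actual tooling; the script and its usage below are hypothetical), here is a minimal Python sketch that walks an unpacked container image’s filesystem and emits an itemized inventory, pairing each file with a cryptographic digest:

```python
# Toy illustration only: itemize the contents of an unpacked container
# image's root filesystem, pairing each file with a SHA-256 digest so the
# inventory can be reviewed and re-verified later.
import hashlib
import os
import sys


def sha256_of(path: str) -> str:
    """Hash a file in chunks so large binaries don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def inventory(root: str) -> list[tuple[str, str]]:
    """Walk an unpacked image rootfs and return (relative path, digest) pairs."""
    items = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if os.path.isfile(full) and not os.path.islink(full):
                items.append((os.path.relpath(full, root), sha256_of(full)))
    return sorted(items)


if __name__ == "__main__":
    # Usage (hypothetical): python inventory.py /path/to/unpacked-rootfs
    for rel_path, digest in inventory(sys.argv[1]):
        print(f"{digest}  {rel_path}")
```

Real tools go much further, typically resolving package names and versions, checking the inventory against known vulnerabilities and signing the result, but the principle of an itemized, verifiable list is the same.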

This was a massive undertaking, which explained Balkansky’s incredulity when he first learned of it. To execute Chainguard Images, Lorenc and his team would first have to build their own Linux distribution, the operating system on which Chainguard Images would run. Balkansky had worked with other companies that had endeavored to do this and hadn’t succeeded; he knew it was an incredibly complex task that could take years to complete, and even then there was no guarantee it would work.

Balkansky wasn’t the only one who was skeptical. Other software companies with thousands of engineers found it hard to believe that Chainguard, with only 25 engineers, could not only build a Linux distribution but also comb through and verify numerous libraries of code. “Yes, it seemed impossible,” Lorenc says. “Yes, just based on the sheer numbers, the lines of code we would need to pore over and adjust to make secure, it was daunting. Everyone else looked at it and saw it as too hard of a problem, too massive to actually solve. But it seemed obvious it was something we needed to do.” Lorenc shrugs. “To me, it didn’t seem impossible, it just seemed hard. And I’m okay with hard.”

“Hard” paid off. Lorenc and his team spent a year working tirelessly on Chainguard Images. To keep the team motivated and energized through long days and even longer nights and weekends, Lorenc drew on the lesson he had learned at Microsoft, making sure that hours of focus were interspersed with jokes, lighthearted stories and plenty of room for laughter.

Balkansky was sitting in a conference room in early 2023 when Lorenc let him know Chainguard Images was live. Balkansky’s first thought: “Pretty damn impressive.” 

Today, Chainguard has rebuilt around 80% of the open-source code most companies use, and because Chainguard is the only company offering access to this type of library, it’s racing to keep up with demand for the remaining 20%. Chainguard also takes custom orders: if a company needs a container specific to its product verified, Chainguard will take it on.

As time-consuming and challenging as building out Chainguard Images continues to be, Lorenc recognizes its payoff for users. It’s akin to building a house: yes, you can use cheap and readily available materials for construction and deal with a rickety staircase or a toilet that won’t flush later, but Chainguard’s philosophy is that it’s worth the extra time, money and labor to start with high-quality materials. Lorenc believes in taking every measure possible, no matter how complicated, when it comes to keeping software and its users safe.


Working in security necessitates a certain amount of paranoia—the fear of what you’ve missed, of an unseen crack or fatal flaw, is relentless. Lorenc’s use of humor to ease the high-risk, high-anxiety environment has paid off in more ways than one. He regularly makes his own memes, many of which have gone viral within software development communities. One meme shows Ken from the Barbie movie with the caption “My job is just CVEs.” (CVE stands for Common Vulnerabilities and Exposures, the standard catalog of publicly disclosed security flaws.) Another depicts Ned Stark from “Game of Thrones” with a caption that reads “Brace yourselves. VEX is coming.” (VEX, short for Vulnerability Exploitability eXchange, is a format for communicating whether a product is actually affected by known vulnerabilities.)

Lorenc’s goal in creating memes was originally to provide levity and build community with people working in security, so he was surprised when his social media presence became one of Chainguard’s most successful marketing strategies. He regularly sees his memes making the rounds on LinkedIn and X, formerly Twitter. Chainguard’s TikToks, many of which feature Lorenc in his trademark backward baseball cap, are also popular among engineers; he speaks to viewers as though he were guiding a friend through a casual tutorial on software security. Lorenc attributes the love for Chainguard’s social media to his users’ desire for authenticity. He’s not some faceless CEO of a corporation; he’s a software engineer, just like his customers, and he speaks their language. It’s not a traditional avenue for a security company to market itself, but Chainguard has always done things its own way.

Despite his numerous duties as CEO, Lorenc says his favorite part of the job is still working with other engineers to solve complex and granular problems. He thinks Chainguard was able to attract some of the best engineers in the industry precisely because the company has such lofty goals. “The people on our team were drawn to the difficulty of our mission,” he says. “Smart people want to work on hard problems.” Often when people encounter what seems like an impossible series of problems, they get overwhelmed, but not Lorenc. He knows every inch of this engine, and working on it part by part, he’s confident he can get it to run.