Ep. #7, Investing in AI with Wei Lien Dang of Unusual Ventures
In episode 7 of Generationship, Rachel speaks with Wei Lien Dang of Unusual Ventures about the landscape of Generative AI and what it means from an investment standpoint. They analogize previous platform shifts to better understand what’s happening with LLMs today, and together they unpack the qualities investors seek out when investing in the world of AI.
Wei Lien Dang is a general partner at Unusual Ventures and leads investment in devtools, security, data infrastructure, and open source software. He's spent more than a decade building products at startups. Wei was a co-founder of StackRox, a cloud native security company, where he was responsible for product management, user experience, product marketing, and technical evangelism prior to its acquisition by Red Hat. Before that, Wei was Head of Product at CoreOS and spent time on the investment team at Andreessen Horowitz.
transcript
Rachel Chalmers: Today I'm so pleased to welcome Wei Lien Dang. Wei is a general partner at Unusual Ventures and leads investment in dev tools, security, data infrastructure and open source software. He's spent more than a decade building products at startups. Wei was a co-founder of StackRox, a cloud native security company where he was responsible for product management, user experience, product marketing and technical evangelism, prior to its acquisition by Red Hat.
Before that, Wei held product leadership roles at CoreOS, also acquired by Red Hat; Bracket Computing, acquired by VMware; and Amazon Web Services. He's also spent time on the investment team at Andreessen Horowitz. Wei has an MBA from Harvard Business School, a JD from Harvard Law School and a bachelor's in Applied Physics from Caltech. You were not skipping your homework, were you?
Wei Lien Dang: There's definitely too many degrees, Rachel. I'll say that.
Rachel: Wei, how are you seeing the investment landscape for AI? From my point of view, nothing's getting funded that doesn't have an AI angle and I know a lot of people are worried that we're just throwing money at everything.
Wei: Well, it's definitely an exciting time when it comes to the landscape of generative AI and what that means from an investment standpoint. I mean, you certainly have a lot of capital going into companies really addressing and looking to incorporate AI all across the stack. We would characterize it as a generational platform shift that is going to have lasting, significant impact. Now, I think the question really is then where are the real opportunities in the short and near term, versus longer term? And how should people think about it?
So I do think we're definitely in the hype cycle, for sure. I think there are definitely a lot of AI companies that over the long run are not going to make it, but from an investment standpoint we really try to understand and seek out the opportunities to partner with founders who are looking to go after big opportunities where AI can have significant impact. I think the most exciting thing around AI is that it holds the potential to really fundamentally change the way that we all interact with and leverage software.
I think there are stepping stones to what that means in terms of the infrastructure and tooling required and the applications that will emerge leveraging AI. But we're definitely very excited, and I would say cautiously optimistic, about the promise that AI holds for all of us. In terms of some of the bigger opportunities, I focus a lot on infrastructure software as well as applied AI, and I think there's a ton of opportunity around what 30 million developers are going to need to be successful with AI.
How are they going to take advantage of Large Language Models and these different innovative technologies, and how are they going to build new applications with that? I analogize to previous platform shifts, when you saw the shift to cloud and the emergence of cloud native architectures. Each of those really drove significant ecosystems of new companies helping people solve those types of problems.
Rachel: I do want to come back to the point about how developer work is going to change, because I think that's incredibly interesting. Do you worry at all that generative AI in particular, with its huge hunger for cycles and energy, is giving an advantage to the incumbents in a way that wasn't necessarily the case with previous platform shifts like cloud?
Wei: I think the incumbents have certain advantages, I would say, Rachel. I think it depends on the lens through which you look at them. Certainly there are very large tech companies who have been working on AI and investing significantly in research for years and years, and that has given them a headstart. There's also this notion of existing incumbents in different market categories who are going to augment and extend their products using LLMs and AI and models and so on.
So the way I think about it is that, as a startup, LLMs by themselves don't necessarily justify the existence of a new standalone company. I think you have to think of it through the lens of what is really the why now that drives the mission for a brand new company, and how someone can ride the AI tailwind. I think that has different impact in different categories.
For some incumbents, I think the threat of AI or the promise of AI is almost existential to their business, and in other cases that's not the case.
I think the example I would cite is pick a company in the observability space, like Datadog, since we're talking about developers and dev tools and things like that. You can definitely see them extending and augmenting their capabilities with LLMs, and so if you want to go and build a new observability company, saying that you're going to leverage AI is not enough in and of itself. I have seen many of these companies with really the same story, and so you don't necessarily come across as differentiated and you might question whether your insight is truly unique or not.
Versus I think in other categories, and it could be observability as well, you could completely rethink the workflow or user experience, or come up with something really differentiated using AI and AI becomes more of a means. Those are the types of opportunities that I'm more interested in, so I think incumbents can definitely have an advantage but it also, I think, is not one size fits all. I think it's somewhat more nuanced and dependent on the individual category and to some extent, how a company thinks about it.
Rachel: Makes sense. Listeners who are keenly interested in applications of LLMs to observability, don't miss our episode with Liz Fong-Jones and Philip from Honeycomb talking about their Query Builder. Wei, are you looking at other areas of AI? Are you looking at other kinds of machine learning? Where do you see the most promise?
Wei: It's interesting that you ask that, because in some respects AI is an extension of, or a part of, the continuum that ML lives on. You've had people working on ML for a whole generation of what we would call MLOps companies that have existed for some time, and I think the real difference with AI becomes clear when you compare it to what I would call the MLOps era. I say that like it's the previous generation, and frankly it's not that long ago.
Rachel: 18 months.
Wei: Yeah. I think for me it was more characterized by a time when you had a certain set of companies with dedicated machine learning engineering teams who were solving for particular use cases. Underwriting, fraud detection, recommendations. I think what's really changed and what this newer set of generative AI models drives is the opportunity to really expand beyond all those use cases and make AI ubiquitous across all sorts of different types of applications.
To me, what I'm actually most interested in, if you're talking about how you go from that set of companies to where we are now, is how people can really make AI more accessible. I think that means making it easy for any engineer out there to take advantage of the capabilities of Large Language Models and so on. There's also this whole dimension where you have trends like open source models that are democratizing things and making more options available to users, alongside proprietary solutions that are giving you an API.
Basically a very easy way to interface with and actually use these capabilities that doesn't require a whole specialized engineering or Ops team to take advantage of. So I view that as, in some sense, it's all ML, but I think really the big distinction is more like, what does it mean for engineering teams? What does it mean for businesses who want to build new applications or have internal use cases for LLMs? And how can they more easily take advantage of it, versus the fact that ML was relatively difficult to implement if you went back not too long ago?
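To make that accessibility point concrete, here is a minimal sketch of what calling a hosted LLM through a provider API can look like, assuming the openai Python package (v1-style client) and an API key set in the environment; the model name and prompt are placeholders:

    # Minimal sketch: using a hosted LLM via a provider API.
    # Assumes the openai package (v1-style client) and OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()  # picks up OPENAI_API_KEY automatically

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; substitute whichever model you use
        messages=[{"role": "user", "content": "Summarize this error log in one sentence: ..."}],
    )
    print(response.choices[0].message.content)

A few lines like this are essentially the whole integration, which is the point Wei is making: no dedicated ML or Ops team is required to get a first useful result.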
Rachel: That's something that I'm really excited about too, the potential of democratization in our working lives. We've seen programming go from five million to 50 million developers. We've all been wondering how we get to 500 million, and I think a lot of people are looking at gen AI and thinking that this could be the shift that makes that possible.
Wei: Yeah, I certainly think so. There's this notion that nearly every company will become an AI company at some point, because even if something isn't fully AI native, there's a lot of opportunity to augment what already exists with AI. Take cybersecurity, take financial services, take these different categories or verticals. At a minimum, I think AI can often provide more of an assist, a supportive type of model alongside what people are already doing, in order to make them more productive, or to unlock the ability to drive better outcomes, or to be more efficient or more effective.
So I do think there's this idea that across all these different potential applications, it's not necessarily that AI is going to full on replace things as they look today. But I do think that things will look markedly different if at a minimum you have a Copilot model for a lot of different applications that people are trying to build.
Rachel: Are there areas of AI that you see being massively overhyped at the expense of substance? Are there areas you think are overcapitalized right now?
Wei: Well, I think there's definitely a lot of attention on the outsized amount of capital that is going into foundation model companies. I wouldn't necessarily say that it's not warranted, though. To go build in that space is capital intensive, and you need to be appropriately positioned to be able to go train, build, and develop a new model. I do think there are areas around what we would call applied AI, certain things at the application layer, where there are many, many people trying to incorporate AI and a lot of funding going towards it, but with relatively little differentiation.
So as a founder, going back to what I was saying earlier, beyond the fact that you're using LLMs or some form of AI, where does your unique insight stem from? Are you rethinking the user experience in some way? Are you building some kind of domain specific model or something that provides more defensibility? I think it's worth thinking through these things, because there are too many people out there right now who are overhyping the promise of AI. The other area I would point to that seems overhyped is people getting ahead of themselves in terms of what they're promising AI is capable of.
I hear pitches claiming it's going to fully replace software development teams right now, and so I do think some folks are leaning a bit forward on their skis in terms of what they're trying to promise. If you look at what these LLMs are capable of, there's certainly a lot that's encouraging and a lot that's immediately applicable, but there's still a lot of room and a lot of challenges that need to be addressed for them to actually have the impact that we aspire for them to have over the long term.
Rachel: Let's talk about the developers. How do you see gen AI in particular enhancing or detracting from the work that developers do? And feel free to mention Prompt, which is how you and I met: Mike Sawka's company, which is building a new terminal.
Wei: Yeah, and I'd also love to hear some of your thoughts on this, Rachel. The way that I look at this space is that in the last couple of years there's been incredible impact from products like GitHub Copilot, which I would point to as one of the most widely adopted applications of LLMs today and a significant success. It crossed $100 million in revenue, with millions of developers using it, and I think that is a model for how AI can help here and now.
So you have this AI coding assistant type of model that really augments the experience for developers and helps make them more productive. I think what Mike is building at Prompt is along a similar line. Consider the terminal: it's been around for a long time and, I would say, has not really benefited from or experienced much in the way of innovation.
I think AI is one of those catalysts that can provide a step change, making it a lot better at helping developers be more productive, unlocking new types of workflows in how they go about their daily work, and really allowing them to focus on what most of them enjoy most, which is the creative aspect of building new things. A lot of folks are asking, "Does AI end up cutting into the number of developers out there over time?" My view is that we're a long, long way off from AI, or AI agents, actually being able to replace software developers.
I think the assistive model is what we'll see for a long time. And I would actually say, if you expanded your notion or definition of a developer, AI has the ability to grow the number of developers. There's this concept of an AI engineer who can easily work with LLMs and AI models, and I think that's a really awesome thing. Over the last couple of decades you've seen the number of software developers increase enormously, to about 30 million now. If you can turn more people into AI engineers, that's a huge opportunity. But I'm curious to get your thoughts, Rachel, on how you think AI can either help or negatively impact developers and what they do.
Rachel: I have so many thoughts on this, thank you for asking. I do see a tremendous bifurcation in how, I would say, two very broad groups of programmers approach gen AI and incorporate it into their work. One I would characterize, not in a derogatory way, as naïve use, where people ask ChatGPT to give them some code for something and then cut and paste that code. That can work on a very small scale, but I would say it characterizes a tactical and slightly blinkered view of what a coder is trying to do.
When you talk to really senior systems architects, you see something different. I had Mark Wallace on this show for another episode; he runs the infrastructure for a big gaming company called Global Worldwide. What Mark and programmers like him, architects I would say, are really doing in their day-to-day work is balancing many, many trade-offs in a multivariate space in order to optimize resource use against particular desired outcomes.
So they have this enormous system spinning in multidimensional space in their head, and when they're harnessing ChatGPT to help with that, they're still using it in that very tactical way. But they also have the experience and the context to easily spot hallucinations and misapprehensions, and so they're able to incorporate even AI-generated code in a way that accounts for those large systems issues. I guess there's a third model that I hope to see emerging, which is that tools like LLMs will help junior programmers become senior architects by helping them develop that intuition.
Wei: Yeah. I think it's really interesting when you think about how LLMs have the ability to turbocharge what a developer is capable of, and I do think that, aside from autocomplete functionality and working within a codebase, there's an opportunity to also help developers level up. In fact, if you look at developments in the AI coding space today, a lot of people have this idea of an AI coding assistant that functions like a junior developer, one you can interface with in a more collaborative way.
Then you potentially start to supervise the work of that AI agent. But again, I think from a standpoint of it becoming fully autonomous, we're still a ways out. But I think it's very encouraging to actually see and observe what these folks have demonstrated using LLMs today.
Rachel: Do you worry about the risks of particularly your portfolio companies harnessing commercial LLMs which a lot of times are black boxes? Do you factor that into investment decisions?
Wei: I think of it more as a technology choice that companies and businesses have to make, and I think there are implications to those choices. For instance, going with a commercial or proprietary solution means your cost structure looks different, and depending on the type of company you're building and the type of customer you're selling into, it may mean people have more questions like, "Hey, how is my data being used?" if you're utilizing OpenAI or something like that.
But to me it's a technology choice, and one with trade-offs. One of the things I've been most excited about, and have written quite a bit about, is all the developments around open source models in the last year or so. The great thing about that is that teams now have more choice and flexibility in terms of what they adopt outside of the commercial solutions.
But that has its own drawbacks too: deploying, managing, and running your own models, maybe standing up your own infrastructure, is not necessarily for the faint of heart. So I think startups in particular should weigh these different trade-offs, figure out what's best for them, and just be aware of the implications on both sides. I wouldn't say I'm worried, but I do spend time discussing those considerations with the founders that I work with.
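As a rough illustration of the self-hosted path and its operational weight, here is a minimal sketch of running an open source model yourself with the Hugging Face transformers library; the model name is only an example, and the packages, GPU capacity, serving, and updates are all assumptions you would own:

    # Minimal sketch: self-hosting an open source model.
    # Assumes the transformers, torch, and accelerate packages plus enough GPU memory.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="mistralai/Mistral-7B-Instruct-v0.2",  # example open model; choose your own
        device_map="auto",  # places weights on available GPU(s); you provision the hardware
    )

    output = generator("Summarize this error log in one sentence: ...", max_new_tokens=80)
    print(output[0]["generated_text"])

Compared with the hosted API sketch earlier, the code is barely longer, but everything around it, hardware, scaling, monitoring, and model upgrades, now sits with your team, which is the trade-off Wei is describing.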
Rachel: When you're considering an AI or any other investment, what qualities are you looking for in founders?
Wei: Yeah, it's a great question, Rachel, because there's certainly a set of things that we look at or consider that are maybe more specific to an AI or an AI enabled company. But it really fundamentally starts with the why now, and I think that's true for any company whether you're AI or not. What's the reason you're needed? Why does the world need another company that does what you do? I think along those lines, what is your unique insight or almost unfair advantage to go after the opportunity?
So I think that's true regardless of sector, and it's something that I spend a lot of time thinking about with the companies that I get to know. But those are some of the things that I would look for. The things that are maybe more specific to AI: we're in an emerging market around all things AI, and so really understanding and trying to assess where the relative problems and pain points exist today, versus what will come later, is really important, because as a startup you have to survive and advance.
I think it comes down to startups really identifying what I think of as an urgent pain point. There are some parts of building with AI today where people say, "Yeah, this is a huge friction, it's a blocker," versus some things that are nice to have and that they'll go figure out later. So really making that distinction matters, and it's hard in a market that is this noisy, where there's all this attention, things are moving so quickly, and trying to stay on top of all these developments from week to week is pretty challenging.
But I think that it's really worth, as a founder, putting in the time to be thoughtful about some of those questions and at the very least I think it'll position you to increase your likelihood of success.
Rachel: Yeah, I think founder-market fit is the top line criterion that's all too often overlooked. Is there something that you wish founders knew at the beginning of their startup journey?
Wei: It's funny you ask that. I think my answer would be that I wish they knew how hard it is, but then again, if they knew how hard it is, maybe they wouldn't want to start something. I do think one of the things I see many successful founders share is a humility that they don't have it all figured out, and the faster you can learn, and the more open you are to learning and adapting quickly, the stronger a position you're in as a founder of any company.
Now, I do think you have to have a strong point of view on things, but I would say somewhat loosely held, especially as you're in the early phase of company building and getting to product-market fit. I think that's where it's paramount, because it's extremely rare that someone gets it right straight out of the gate, and the more you have this mindset of "I'm going to iterate quickly and learn quickly, because I have some good hypotheses or assumptions but I don't have the full picture yet," the better off a founder will be.
But that takes a certain degree of both humility and self-awareness as a founder, and also using that to inform who you team up with, who you bring on as your supporting cast, and how they complement you so you can go through that process effectively: learn really quickly, sniff out the key problem and pain, and figure out what that means for the company.
So I would say that's the thing I wish founders knew, and it comes from a place of having been a founder myself. I know the pain of getting the product wrong, and it's very hard to unwind. When I look back at my own journey, we could've mitigated that risk if we had just been more open to questioning some of our assumptions.
Rachel: I, like you, am drawn to relatively low ego founders and I think that does make it harder for them in an industry that prioritizes insane self confidence over many other valuable qualities. But I think what characterizes the most successful low ego founders is that they've found a user base, a customer that they are willing to champion and that they're willing to exert resilience and grit for the sake of other people in a way that they might not necessarily do for themselves. I think that makes people very evangelical and very empathetic with their beachhead customer base in a way that's really generative of great companies.
Wei: I totally agree with that, Rachel.
I think that authenticity matters a lot, in terms of how you engage with really everyone, whether it's your users and your customers, or your internal team or wherever it is. I think as a founder, bringing that empathy, bringing that authenticity, it really counts for a lot.
I would say even more so when there's more attention on that, for instance if you have a community and a following, where you're not just leading a company. You're actually leading a group of people who are enthusiastic about a particular mission and problem that you're solving, and they want to be engaged, they want to be active, they want to be involved. I do think what you're saying is really, really important, and it can't be outsourced. It has to come from you as a founder, it can't be delegated, and I do think that is sometimes not fully appreciated at the outset by some people.
Rachel: I think there are founders who are like, "DevRel is something I can hire someone to do." And they don't understand that it's that connection with, as you say, the community. Not every founder is going to look like a traditional tech evangelist or DevRel person, but there is a way that every founder who's solving a meaningful problem can find a way to communicate authentically with the people that they're trying to work with. I think it is part of the founder's journey to find that voice, to find that passion and that ability to speak clearly.
Wei: I totally agree with you, Rachel, and I'm so glad you brought this up because I work with a lot of, especially open source founders, who face this type of situation. In fact, I was just chatting with a founder last night of a really successful AI platform and he was like, "Being like this prominent voice and so on doesn't come naturally to me. But I know I need to engage with my user base and my community."
So what we were talking about, and it was something similar to what you're saying, is finding the right way to bring out that voice. Where I do think folks like DevRel, developer advocates, and evangelists can help is in amplifying it. But as a founder you have to think through what your voice is, because in most of these cases people are actually really hungry for thought leadership, for hearing from someone they respect about how to think about a problem space.
Especially in this emerging AI paradigm, people are asking, "How do I go about solving some of these things? What is the right way to think about it?" And people are actually really excited about hearing from someone who is knowledgeable, who has a unique point of view. I think there are also other scenarios where you might need to be a little bit controversial. You might need to be a bit of a lightning rod, and you can't necessarily shy away from it.
But I do think that can come in different forms; it doesn't necessarily have to be that you're always vocal and on the front of Hacker News. There are other ways to engage with your user base: get to know people in smaller settings, one on one, virtually. There are a lot of opportunities to build relationships and have that authenticity come through. In the long run, that really helps build trust within your user base, and that can lead to really awesome things in terms of how it propels your project, your product, or eventually your business.
But I very much believe, as you were alluding to, that DevRel and evangelism and these things can't be delegated, at least certainly not in the early days. Community leadership has to come from the founders, and the voice has to come from the founders. I do think there are ways to stand at the megaphone, so to speak, to get the message amplified.
Rachel: When I think about the breakout successes of my career, VMware, LaunchDarkly, Aviatrix, Honeycomb, not only did the software reflect the character of the original founder, the community that was drawn to the software reflected an interest in hearing the voice of the founder. Because your code and what you write and what you say are all different aspects of how you solve problems in the world. I just find it fascinating that we've built this industry that we pretend is entirely STEM, but in fact the soft skills, the ability to write and persuade, are intricately braided all through it.
Wei: Yeah. I would say in some markets it's essential. Take the cloud native community, which emerged over a decade ago. For many people that really changed the way they thought about what I would call the overall go-to-market, because the way you engage with your user base, the way you help foster and think about adoption, was very different from the world of pure enterprise sales. So I definitely agree with you, and I think this is where, whether it's open source or PLG or whatever it might be, the concept of community is really powerful and provides a huge advantage to companies who can grow a significant and successful community in addition to the underlying technology.
Rachel: So Wei, I'm going to make you the lord emperor of the solar system for the next five years, everything in our industry goes exactly the way that you hope that it will. What does the world look like five years from now?
Wei: What I think about and what I'm optimistic about, Rachel, is that we're certainly in the thick of the early days of AI right now, and if it plays out, I'm really excited for a whole new generation of companies and kinds of businesses that we can't even anticipate yet. If I go back to cloud or mobile, those platform shifts gave rise to a whole new class of companies and, frankly, business models that didn't really exist prior to the emergence of those underlying foundations.
So that's what I'm most excited about. It actually makes it hard to predict what that will look like, but I think AI can similarly unlock a whole new set of generation-defining companies that we just don't understand yet. I actually think that's one of the privileges of you and me and others spending time where we do: you can see it unfold day after day, week after week. I think there's a lot that the next five years will bring.
Rachel: Another thing that we've been working on during these five years is a colony ship to Proxima Centauri. As Lord Emperor it's your honor to christen it, what are you going to name our colony ship?
Wei: I didn't anticipate that question, Rachel. I don't know the name offhand, but I would say whoever was the son of Apollo. Growing up I was a big space fan, and so in the same way that we really pushed the boundaries with Apollo, and I guess now Artemis, I think this would be a successor to that. So whatever name that is, I would pick that one.
Rachel: Oh, Asclepius is the son of Apollo. The hero and god of medicine.
Wei: Okay. That's a bit of a mouthful. But I think it pays homage to the notion that we stand on the shoulders of giants which is true in tech and software as well.
Rachel: Apollo had a lot of children. Another one is Orpheus which would make for a very musical ship, but I do like the idea of the god of healing and medicine being our patron saint on the long journey.
Wei: Yeah. That's cool.
Rachel: Wei, it's been a delight to have you on the show. Thank you so much for your time.
Wei: Thank you for having me, Rachel.