January 22nd, 2025 · #ai #coding #tools
Windsurf forked VS Code to compete with Cursor: talking about the future of AI + coding
Discussion with the Windsurf CEO and VP of Product about the company's history, AI coding tools like Windsurf, and the future of software development
Transcript
Wes Bos
Welcome to Syntax. Today we have Kevin and Varun on from Windsurf.
Discussing AI coding tools and how they will affect developers
Wes Bos
We're here to talk today about AI and coding, and there's a lot going on in this space right now. Is it gonna replace developers entirely? Is this a tool that you should be using? Can you trust the code that it's cranking out? There's just so much going on, and we thought, like, let's have them on to talk about it. Obviously, we're gonna talk about Windsurf and the editor that they forked from VS Code, right? And we're gonna talk about that and how it compares to Copilot and Cursor and all that. But, also, I'm more interested in just, like, talking about what the state of web development looks like with all these crazy AI advancements in the last couple years. It even feels like in the last six months everything has really cranked up. So welcome, Kevin and Varun. Thanks for coming on.
Guest 1
Thanks for having us. Good to be back. I guess this is my second time on the podcast, so thank you for having me back. And then, yeah, his
Wes Bos
debut. Yep. Do you wanna give us a quick rundown of who you are and what you do, and kinda place what Windsurf is in relation to Codeium?
Windsurf started as a GPU virtualization company, then pivoted to generative AI
Guest 2
Yeah. Happy to do that. So I'm Varun. I'm the CEO and cofounder of the company. If we go really, really far back, close to four years ago, the company actually started as a GPU virtualization company. So we built software to make GPU infrastructure really, really efficient. We even built compiler software for that. The middle of 2022 rolled around, and we saw that generative AI was going to be huge. We thought all the deep learning workloads were fundamentally gonna be generative AI workloads. And in that world, we were early adopters of GitHub Copilot, and we thought it would be really exciting for us to take our infrastructure and go ahead and build an application.
Guest 2
And we built Codeium for that reason. And Codeium started out as purely an AI autocomplete tool. We provided it for free. Now, the product is still entirely free. But the trade-off we made was we were an extension in every single IDE. That means VS Code, IntelliJ, PyCharm, Eclipse, Visual Studio, you name it, we had support in there. And we worked with some of the largest companies out there to not only deploy the product internally behind their firewall, but also personalize the results to the private data inside the company. Our goal here was: hey, look, these models can generate sort of generic suggestions, but what happens if we could do it in a way that was more personalized? And we actually trained and ran our own models to go ahead and do that. And just to skip to where we are now: what we felt a couple months ago was that we could build a premier experience, and we were getting limited by what the IDE fundamentally offered by being a VS Code extension. We wanted to make a premier sort of experience, especially given where we think AI is going and how the way IDEs are gonna get used is gonna fundamentally change. Happy to talk about that a little bit more, and that's why we launched Windsurf in the last two months.
Scott Tolinski
Yeah. You know what I've always appreciated about Codeium in the past was just how fast it was, and it had, like, a definite eye towards general user experience. So when Windsurf was announced, it's not surprising that the focus is maybe on rethinking user experience a bit more in this space.
Windsurf focuses on good user experience for AI coding tools
Scott Tolinski
Has that always been, like, a forethought for you guys, to say user experience in these tools needs rethinking? Because, you know, sometimes they can just feel slapped on, where we're either adapting our coding tools towards how the LLMs work or we're just shoving them into the editors. Right?
Guest 1
Yeah. Completely. I think that's one of the primary motivations for doing the fork. It felt like the generative AI coding space was changing. The models and the intelligence were just getting better, but the user experience wasn't able to meet the power of the LLMs. I mean, you had examples of people generating brilliant code in ChatGPT, and now you've got this gap, right, between the browser and the editor. Like, we really had to figure out how those two things could live together in harmony. Yeah. So one good example of this is, if you've used Windsurf, we have that Cascade action bar. It's kinda like you can step through files. It's almost like a merge conflict resolver. Right? You're saying: oh, I have these changes from the AI, I have my own changes, I wanna review and step through those changes and accept and reject the ones that I want. Those are examples of patterns that are kind of impossible, or would be possible but feel really bad, if you were an extension, because it doesn't feel native to the experience. And so that's
Guest 2
You hit on a good point, which is user experience is kind of king. Yeah. And I guess maybe to add one more thing to that: we're not a company that's like, hey, these things are completely going to replace engineers. Because if we designed the product to be like that, it would just be this blob that ran asynchronously, and you'd kind of look at the response afterwards. We really care about the human-AI interaction paradigm. Right? Like, that's what autocomplete was in the very beginning. And right now, I guess the hard part about the paradigm is that the AI is writing more and more of the software. So, actually, the way the human and AI interact is fundamentally changing. And for that, we needed to go ahead and build a new interface.
Wes Bos
Oh, yeah. Yeah. It makes sense. Scott and I were even laughing, because the reason we initially switched to VS Code was because VS Code provided a much richer UI for the different types of things. You know? There's autocomplete. There's suggestions. There's tooltips when you hover over top. There's sidebars and things like that. Mhmm. And then as we get into figuring out what software development looks like, we sort of wonder, like, do we need a different UI? Is the code editor even the right UI for this type of thing? So I have to ask: obviously, Microsoft has their own, like, dog in this race as well. Are they just moving, like, too slow, or do they not open up certain parts of it? Or, well, they're already implementing a lot of these features as well. Do you see a future where VS Code also has this?
Guest 2
Yeah. I think the way I sort of think about it, and the team sort of thought about it, is: in theory, they could have anything in the future. I think that's objectively true. I guess the problem for us is, if we have to build a great experience and we are limited by whatever timeline they have, it kind of means that, hey, we can't tell our users we're smart and our product is great while, meanwhile, people don't want to use it. I'll give you maybe a simple example of this. Truly, truly simple example. And this doesn't even have anything to do with Cascade, which is the agentic part of the product. So we have this feature called Supercomplete. And the idea of Supercomplete is that it's the next rendition of autocomplete, which actually does next-intent prediction. As you're writing code, it also refactors code for you. And the way we actually ended up building that for VS Code, because they didn't let us provide a passive edit sort of functionality, was we needed to go out and dynamically generate PNGs and stick them on the side. And by the way, they were hard to get aligned given what the font size was. And we actually noticed that the moment we implemented it natively in Windsurf, the acceptance rate went up by multiple factors. Yeah. Same product. Same exact product. Right? And I guess the fundamental reason is that what we think is valuable to the end user may be different than what Microsoft thinks is valuable to the end user. They might think putting a lot of tooltips in GitHub is valuable and putting a lot of things in VS Code is not as valuable, but that ends up indirectly hurting us, right, if that's the case. So I think we started from what is the best possible experience we could build, and we were constantly realizing that we were fighting against what was accessible in VS Code.
And it was not even just because they made proposed APIs that no one else can access. They also did that. But on top of that, they just didn't have a lot of functionality that we really wanted. So I think maybe the way I sort of look at it is: every editor wins because of some disruption that's happening in the market. Right? Like, why did VS Code win? It won because TypeScript became popular, and you were able to build this very clean IDE experience on top of that, and then a very good extension framework that enabled a single IDE to handle a bunch of different languages, unlike JetBrains, where every language is its own IDE. Right? So you have this unified view. And I think now, with AI, there's going to be a higher ceiling on what's possible, and we wanna hit that ceiling. That was, like, the rough take we had. Totally. Yeah. That UI matters a lot too, because I find myself,
Wes Bos
like, over the last six months, when I asked AI to do too much, I would just get overwhelmed and be like, I don't know, and then close it. You know? And I've found myself, as the UIs get better, allowing it to do a lot more. And a lot of that has to do with how it's actually presented to you. And that's actually kind of a funny thing as well, because, and maybe we can get into this, how you provide context to the model is extremely important. Like, we had Kevin on, I don't know, probably a year ago, and he was talking to us. And we're like, hey, you're using the models that everybody else is using. And he's like, it's not just about the model. Like, we can train models and whatnot, but what's really important is the context that you provide to it. You know? Like, what pieces of information are you giving it so that it can provide you the best possible output? So I kinda wanna go in that direction: like, how do you make the best possible completion or generation? You say: alright, make a new file that is a component and do x, y, and z. How does it know that I'm using React? How does it know to use specific camel casing and all those things, the way I code?
Guest 1
Yeah. So, on the paradigm shift with Windsurf: I remember having this conversation with you, and it was a lot of fun to think about and kind of audit all the areas of context that we were pulling. And I think back then, you know, we were looking at your files, we were looking at some remote repositories. But we've kind of paradigm-shifted the way we're thinking about that, because Windsurf is inherently an agentic editor. What that basically means is that we retrieve context the way a human would retrieve context. So at the core of the editor, there is this sort of entity that is like: okay, Wes wants to implement a new sign-up form. What are the things needed for the sign-up form? You know, you're gonna start crawling your code base as if a new, let's just say, junior developer just onboarded onto your team. They would go and read the documentation.
Guest 1
They might go and look at some examples of past PRs, or examples of files that already do this. They'll look at some configurations.
Guest 1
They'll be reading. They'll be searching.
Guest 1
And so this is the way that humans kind of digest information. And in that way, we wanted to model how Windsurf would interpret and retrieve information similarly. So when you make that request, it decides kind of a plan of action. It is not just pulling everything all at once and then hoping, fingers crossed, like: I have a million lines of code, let me just resolve this into the solution. It's like: okay, no. First, I'm gonna go look at examples. I'm gonna go look at, yeah, your design system. I'll go look at your Tailwind config. Okay, I think I have enough information now. I'm gonna start generating, and it'll do it piecemeal. Think of it as a human: you might set up your validation library first, before you set up your, I don't know, your POST request. And in this way, you're kind of stepping through the solution, as opposed to just: alright, context retrieved, dump. And so it's a bit of a different pattern.
Scott Tolinski
So when you say agentic, just for the audience, I know you did explain it a bit there. But just to be super clear, it's using the concept of agents, which are going off and executing tasks without you. Right? So what you mean by that is, like, the editor itself is executing tasks like a human would behind the scenes. That's what you're saying?
Guest 1
Yes. More or less. So if you use the Cascade sidebar in Windsurf, you really see what that kinda means. We can run terminal commands on behalf of the user, inside of their file system. So if you wanna push a repository, right, it would say: oh, okay, how would I push this code to a repo? Alright, I'm gonna use these Git commands, and I'm actually gonna execute that on your behalf. It'll read files. It'll read from the Internet, so it'll pull links and try and resolve those links. So it is behaving as if it has tools at its disposal to step through and, action by action, figure out what to do next.
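The action-by-action loop Kevin describes (read a file, search the workspace, run a terminal command, observe the result, decide the next step) can be sketched roughly like this. To be clear, the tool names and the scripted plan below are hypothetical stand-ins for illustration, not Windsurf's actual internals; in the real product, a model would choose each next action from the observations so far.

```python
import subprocess
from pathlib import Path

# Hypothetical tool set -- a real agentic editor would back these with its
# own file, search, and terminal integrations.
def read_file(path: str) -> str:
    return Path(path).read_text()

def search(pattern: str, root: str = ".") -> list[str]:
    # Grep-like search over the workspace, the way a human would Cmd+Shift+F.
    return [
        str(p) for p in Path(root).rglob("*.py")
        if pattern in p.read_text(errors="ignore")
    ]

def run_command(cmd: list[str]) -> str:
    # Execute a terminal command on the user's behalf and capture its output.
    return subprocess.run(cmd, capture_output=True, text=True).stdout

TOOLS = {"read_file": read_file, "search": search, "run_command": run_command}

def run_agent(plan: list[tuple[str, tuple]]) -> list[tuple[str, object]]:
    """Step through a plan action by action, collecting observations.
    Here the plan is scripted to keep the sketch small and self-contained."""
    observations = []
    for tool_name, args in plan:
        observations.append((tool_name, TOOLS[tool_name](*args)))
    return observations
```

For example, "push this code to a repo" would become a plan of `run_command` steps (`git status`, `git add`, `git push`), each executed and observed before the next is chosen.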
Guest 2
Yeah. I think one of the maybe super quick things we had, when it came to what we wanted to get out of these products, is that we think there's, basically, a graduation happening in terms of the capabilities of these products. In the past, with these tools, you would @-mention a bunch of things. Right? You'd be like: hey, build me a button that looks like this. Yeah.
Guest 2
@ this and @ that. And we think that that is, like, what Google looked like in maybe 1999 or 2000, when you would actually tell it an AND or an OR of a bunch of queries, and then you would also say what site to get the result from. And we think that that is basically an outdated pattern. We think what should probably be the case is: you will write natural language, it should go out and do agentic retrieval, and we should actually be building out really good retrieval systems, which I'm happy to talk about a little bit more, to make it so the user doesn't need to @-mention a bunch of things, and it can actually understand the intent at a very deep level.
Wes Bos
Yep. Yeah. Let's talk about that, because that's super interesting to me. Because, like, the steps of starting to use AI are like: oh, I use the tab thing. And then, oh yeah, maybe I'll highlight some text and ask it to fix something. And then: oh, there's this sidebar where I can ask it to do multiple things, create multiple files, edit all these things. You know? Like, what I've been doing a lot lately is, yeah, doing that tagging. You know? Like, put these files into context. But you're saying that, eventually, we just won't need to do that? Like, it would just be, like, a Google search, where it'll just know what we want? How do you do that?
Guest 2
Actually, most of the people at our company don't @-mention anything. Right? Like, you should be able to find a particular part of the repository with either clever indexing or the fact that we have good retrieval. And some of our retrieval techniques are pretty high-powered. It actually goes through your entire code base with an LLM, chunks it up, and then validates which sections are the best. It should be able to find the best snippets that you should be operating on top of, kind of like the way Kevin said. Right? Like, if you wanna install a package, you shouldn't tell it where the installation file is. It should probably be able to find out where the installation file is, run it for you, and then deploy the app for you.
Guest 1
Mhmm. Or, if you think about a page in Next.js: if you're looking at a contact page, we can resolve the fact that it's page.tsx inside of, you know, probably /contact. And if you think about the way a human would resolve this sort of query, they would probably command-shift-F in VS Code, grep for "contact," and look for files that are named loosely like that. And so this is sort of what we're doing behind the scenes. You don't need to explicitly state the file that you're referencing. It just kinda figures it out.
Scott Tolinski
I know a lot of other editors have this. Copilot on VS Code just added it, and Cursor has this idea of, like, a rules file or a code-style file. Does Windsurf have an equivalent of that, or does what you're doing behind the scenes take the place of something like that?
Guest 2
So we actually do have kind of a rules approach. We call it this idea of memories, which is the idea that you can provide the system with some form of memories. We're very soon gonna launch an approach where memories are actually dynamic, so the AI itself can write new memories; it's not just things that the user writes themselves. But the main reason you would wanna use something like that, and why it's valuable to have this idea of a rule, is: let's say I write in a different language, or, like, I'm a person that's fluent in Turkish. I probably want the AI to be talking to me in Turkish, and hopefully I don't need to tell it every time: talk to me in Turkish. Right? That's a waste of time if it talks in English every time. So for those things, yeah, we do have the ability to have these global rule sets on how you like to build apps. And we think that's generally useful, because it just reduces the number of boilerplate messages that you need to send to the AI, which is a good thing.
Scott Tolinski
And so those are a markdown file? I believe in Copilot, it being a markdown file gives you, like, heading importance. I don't know if that's true. But does being a markdown file do anything special, versus just markdown as a text file that's
Guest 1
For memory specifically?
Scott Tolinski
Yeah. Yeah.
Guest 2
Yeah. I don't... I think all of these things make it more human-readable, which is probably what's valuable. But I think to the AI, it doesn't matter if it's, like, markdown versus a bunch of bullet points. I think the AI is gonna consume it the same way. Yeah. It knows.
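Mechanically, the rules/memories idea described above amounts to prepending a user-written file to the system prompt on every request, so instructions like "talk to me in Turkish" never have to be repeated. A minimal sketch, where the file name `rules.md` is a made-up example rather than Windsurf's actual location:

```python
from pathlib import Path

def build_system_prompt(base_prompt: str, rules_path: str = "rules.md") -> str:
    """Prepend user-written rules ("memories") to the system prompt so every
    request carries them automatically. File name is a hypothetical example."""
    rules_file = Path(rules_path)
    rules = rules_file.read_text().strip() if rules_file.exists() else ""
    if not rules:
        return base_prompt
    # As noted above, markdown vs. plain bullets makes little difference to
    # the model here; the format mainly helps human readability.
    return f"{base_prompt}\n\nUser rules (always apply):\n{rules}"
```

Calling `build_system_prompt("You are a coding assistant.")` with a rules file containing `- Always answer in Turkish` yields a prompt that carries that instruction on every turn.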
Wes Bos
Alright. Let's talk about if we're cooked or not.
Wes Bos
And this is something I've been talking to a lot of people about recently, and you seem to find people who are on one side, where they're like: no, it makes mistakes, you know? And I think those people are kinda not totally understanding how good this stuff actually is, and how good it will be. And then on the flip side, you get these folks who have nothing in their GitHub and think that, like, I'm never gonna write a line of code ever again.
Wes Bos
I'm somewhere in the middle. I've never argued against something I'm so excited for and in favor of in my life. Often, people put me as, like, the anti-AI guy on Twitter, which is hilarious because I use it literally all day long, but I'm trying to be a voice of reason here. But I'm curious, with the caveat of knowing that you guys have raised a bunch of money to do the next gen of dev with AI: what does the future of being a developer look like with AI?
Guest 2
I think it's kind of like what you said. The Internet likes very polarizing thoughts. Right? Like, no one's cool for saying the thing that is, like, normie. So you're gonna have a bunch of Luddites that think AI is not gonna be useful.
Guest 2
Very clearly, that seems like a bad opinion to have, especially when a technology is getting so much better year over year. And then you have people, kind of like what you were saying, who maybe have a hatred towards developers, or who like to get clicks. And for them, they're like: software development is done. Right? You're gonna get those two people, and they'll get the most response. It's kind of exactly what you said. It's in between. Maybe this is gonna be the one hot take I have: I think a year or two from now, if you're not using a tool like Windsurf, you're just, like, significantly less productive as a developer. Mhmm. So I think what actually ends up happening is the number of developers goes up. And I'll tell you the crazy stat about this. Before we launched Windsurf, we had everyone in our company build an app using Windsurf, and that includes our go-to-market team, salespeople that have never written a line of software. And instead of going out and buying these trivial tools, and I'm not saying expensive tools, to do basic things like quoting the pricing of our software, they actually went out and built the tools themselves. So for these very quick standalone apps, they're actually going out and building things. I'm not saying they're the future of developers, but I think the aperture of what is possible for software development will grow in the next 10 years. And I think the number of developers will go up. But I think if a developer doesn't use these tools, they will be substantially less productive. I think that's what's gonna happen, and they will not be able to be competitive with other developers out there. But I think the market is gonna grow. I think it'll grow tremendously.
Wes Bos
So you think it's gonna grow. So what does software look like then? Because, like, even yesterday, I needed to download every single episode of a podcast from an RSS feed, cache it, and skip the download if it was already there, for my daughter. She wanted me to download it and put it on, like, her little music player.
Wes Bos
And I just typed in a little box what I wanted it to do. And, 200 lines of code later, it worked first try. And I was just thinking, that probably would have taken me two hours without this. So, like, if we're so much more productive, what does software look like? Are we just going to take Fridays off? Are we gonna get paid less? Are regular people going to be able to build this stuff for us?
Guest 2
Okay, that's an interesting question. My viewpoint on this: for these simple kinds of apps, I kind of feel like there's not gonna be a tremendous amount of economic value there. There will be a lot of personal value. But in terms of economic value, because this technology has become democratized, you are not gonna be able to sell that podcast downloader to a bunch of people for, like, thousands of dollars. That market has just disappeared in some ways. But, hey, now you can do it for your own personal stuff way more efficiently, and you don't need to go on Fiverr or whatever to get someone to build it. But here's my take on it for the companies, because that is where most developers are, ultimately speaking. Most developers are not hobbyists. They are people that work at companies.
Guest 2
I don't look at it as, hey. Like, these companies are nice entities that are just gonna pay their developers the same amount and ask them to work less. No. Actually, like, if you were to look at it from, like, a purely capitalist standpoint, they have a better return on investment now on investing in software. Their given dollar now is giving them more technology.
Guest 2
Then the question becomes: for this business, what is the ceiling of the amount of technology they can create and get value from? And I think for most companies that are large, there's a lot of technology that can be built to actually go in and optimize the business. I'll even give you a dumb example here. Someone asked me a question. They were like: hey, what about a company like Safeway? Probably there's not much more they could do. And I think, hey, that's maybe a lack of imagination. What happens if they went out and actually built robotics software that operated within the store? There's a massive, massive ceiling on what technology can drive. And I think if you go to every company and ask them what is all the technology you wanna go out and build, it is such a long list that they basically don't even look at a lot of the stuff out there, because they just know that they can't build it. So I actually think about what happens in the short term. Maybe you go to the full AGI thing; I don't particularly wanna speculate about full AGI. But if you look at the medium term, these businesses are gonna see there's an investment opportunity that has a higher return on investment, and they will go out and invest more in that opportunity. That's what the rational belief should be, if that makes sense.
Scott Tolinski
Mhmm. Yeah, it does. And, you know, I don't know if you saw it going around: the USPS website's alert system is just somebody commenting and uncommenting different alerts in an HTML file. So maybe websites like that will actually be able to build a real alert system at the end of the day, because their developers will have to spend less time on it. But I've noticed that entirely with myself as well. I'm not having the editor replace me, or the tools replace me. Right now, I'm able to accomplish flat-out more things, faster and in less time. And we've seen it firsthand over here at Syntax. Our producer, Randy, created a smile-detection app by himself, and he's never made anything.
Scott Tolinski
So it's like, people can just make stuff. Like you said, it's the tools that you maybe would have made and sold and whatever. Yeah. That stuff is going away.
Scott Tolinski
You can't necessarily do that. But the amount of things that we can create personally will increase productivity like crazy on a personal level.
Wes Bos
Like, the bar for features in software, too. Like, I use FreshBooks, and they don't have AI categorization for my expenses. Like, I had to click on every single expense and categorize it. And I'm at a point now where, first of all, I built it myself, because I just use the API. But, like, I'm pissed that they don't have that feature. You can just build that now, and that raises the bar for software. So I'm in this spot where I'm thinking the software will just get better. You know? It's not like we're just gonna sit down and twiddle our thumbs, like: well, we've built the CRUD apps now. Software is gonna keep getting better. We're gonna keep building stuff. Like, we're hungry for apps that will do what it is that we want. So
Guest 2
The expectation will keep going up. It's like when you look at enterprise software from the early 2000s: it is hideous. It is actually hideous. It's some Java applet stuff that you and I would never wanna touch anymore, unless our company forced us to and we worked at a big company. And such a product is not gonna be able to get mainstream adoption now, and that's a good thing. Right? That's a good thing.
Wes Bos
What about larger code bases for this type of stuff? Because it's all fun to make a podcast downloader and whatnot. But, like, let's say you open up, like, VS Code. That's a massive, massive project.
Wes Bos
How good is the AI at doing it right now with the sort of limitations of of context windows?
Guest 1
Quite good. We built Windsurf with Windsurf. From the first version that was available internally, everyone switched over, and we started building new features using Windsurf, and that's current. You know, I've opened four PRs this morning. Probably 50% of the code, at least, is written by Windsurf, and at the very least it tells me what files to change. Right? So if I'm gonna make a change that covers client, server, and middleware, it's gonna tell me exactly where to go, if not write the code itself.
Guest 1
And I think it just goes back to: it's not really about the context window anymore. It's not how many tokens you can shove in at any given time. It's a bit more, like, what was the research done by the agent to figure out the steps needed to create your feature or solve your problem? The agent is able to reason through these massive 100,000-line repositories like VS Code. Maybe to put it in concrete terms: there's an architecture pattern in VS Code where you have to create an API that goes across different threads, and, you know, the VS Code team has thought a lot about this sort of thing. And we're kinda hopping in on their code base and trying to figure out: alright, if we wanna add a button here, if we wanted to add a state variable here, how would we do that? You know, the agent is not gonna just ingest every single file that is relevant. Instead, it's going to say: alright, I looked at this thread, and this is what I gained and learned about this particular part of the system.
Guest 1
Now I need to go learn about system B, synthesize, summarize, understand system C, and then it can kind of map-reduce it, combining all three at one time, and then decide: alright, I'm gonna make these changes.
Guest 1
So it is able to accomplish quite a bit more. It's no longer capped simply by how many tokens you can put in. I think that's what's really cool about the way the AI industry has moved: it's becoming a bit more sophisticated and way more powerful because of this shift.
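The "learn system A, then B, then C, then combine" flow Kevin walks through is essentially a map-reduce over the code base. A minimal sketch, where `summarize` is a hypothetical stand-in for an LLM call that studies one subsystem at a time:

```python
def summarize(name: str, files: list[str]) -> str:
    # Stand-in for an LLM call that studies one subsystem and distills
    # what it learned into a short note.
    return f"{name}: {len(files)} files reviewed"

def map_reduce_understanding(subsystems: dict[str, list[str]]) -> str:
    # Map: study each subsystem independently, so no single step needs the
    # entire repository inside one context window.
    notes = [summarize(name, files) for name, files in subsystems.items()]
    # Reduce: combine the per-subsystem notes into one decision-ready summary.
    return " | ".join(notes)
```

With hypothetical subsystems like `{"client": [...], "server": [...], "middleware": [...]}`, each map step fits comfortably in a context window, and only the short notes are combined at the end.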
Guest 2
Yeah. Maybe the intuition that Kevin just brought up JS, like, because you don't have just 1 shot where an LOM just outputs everything, it's not actually true that you need to pass the entire code Bos into the system. Right? It's more similar to, like, hey. When you solve a problem about a large code base and you maybe contribute something to Linux, you don't know everything about all of Linux. Right? But if I ask you to explain your change and why you made it, you could probably do the explanation for that and the Yeah. Your your process of searching through the code Bos in less than a 100,000 tokens, which is, like, 10,000 lines of code. Right? You could explain that to me, hopefully. Right? Unless it's like a mega complex PR. Right? You could probably explain that to me. So, basically, once you say you don't have just, like, 1 shot at that and you can actually go and run code and you can search through Node and do a bunch of stuff, It actually changes maybe flips the paradigm a little bit from what these early chat products were doing Wes all they were doing was talking about, like, ever larger context windows. Yeah. An example of this is, maybe the unit test generation
Guest 1
with kind of a one-shot approach that you were talking about before. You have 1 chance to write the perfect unit test. Yeah, it would probably, you know, not live up to most people's expectations, and certainly not in the largest enterprises. But now, with the ability to actually execute said test, to first think about test coverage, then run your code, then figure out what's going wrong, and then maybe tweak some parts of the test, maybe tweak some parts of the actual source code, you have many more chances, like Varun said, to be at bat and figure out what's going wrong and how to make things better.
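The iterate-until-green loop described here can be sketched as follows. `generate_test` stands in for an LLM call and is hard-coded for illustration, so this is just the shape of the loop, not a real agent:

```python
# Hedged sketch of the write/run/fix loop described above: generate a test,
# execute it, and feed the failure back into the next attempt, instead of a
# single one-shot attempt at the perfect test.

def run_tests(test_code):
    """Stand-in for executing a generated test; returns (passed, error)."""
    try:
        exec(test_code, {})
        return True, None
    except AssertionError as e:
        return False, str(e)

def generate_test(attempt, last_error):
    # A real agent would call an LLM here, conditioning on last_error.
    if attempt == 0:
        return "assert 1 + 1 == 3, 'first draft is wrong'"
    return "assert 1 + 1 == 2"

def agentic_test_gen(max_attempts=3):
    error = None
    for attempt in range(max_attempts):
        test = generate_test(attempt, error)
        passed, error = run_tests(test)
        if passed:
            return attempt + 1  # number of attempts it took
    return None

print(agentic_test_gen())  # the second draft passes, so this prints 2
```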
Scott Tolinski
And if you want to see all of the errors in your application, you'll want to check out Sentry at sentry.io/syntax.
Scott Tolinski
You don't want a production application out there that, well, you have no visibility into in case something is blowing up, and you might not even know it. So head on over to sentry.io/syntax. Again, we've been using this tool for a long time, and it totally rules. Alright.
Scott Tolinski
So I know a big limitation, when libraries or platforms make a major change in their code base, is that the AI tools all suddenly are giving you the last version of stuff.
Scott Tolinski
So, like, me personally, you know, Svelte 4 went to Svelte 5. They changed a whole bunch of stuff. Now every single time I do anything in any AI system, I have to urge it several times: please stop giving me this old syntax. How do you stay on top of those types of situations given where the LLMs are at? Yeah. I've got some thoughts here. I think there's a combination of, like,
Guest 1
the recency of the data. Not necessarily the training data, though new iterations and, like, training data getting updated is certainly part of the equation. But simply the ability to expose the web, and expose, like, the most recent information, that's gonna be a key part of this. And then the second interesting part that you mentioned: it sounds like your frustration is that you have to urge the AI to adhere to rules that you've told it many times. Right? And even the youngest toddlers, hopefully, are reasonable at listening a second time, and I think that's where the concept of the Yeah. I think that might be a little optimistic.
Guest 1
It's really yeah. Yeah. I guess.
Guest 1
But it's this kind of, like, marriage of those 2 ideas where, okay, if you're using a new library, a new version. Like, we're doing an internal upgrade to React 19, and we are internally using, like, web search to do this. And you'll just say, I'm using React 19. Now that's logged in memory automatically.
Guest 1
That's not something that you have to put in the rules or yell continuously every time you wanna send a message, but now it just kind of knows we're gonna use React 19.
Guest 1
It's able to retrieve kind of the change log, the release notes, and the breaking changes, and it'll recognize, okay, here are the differences that I need to account for in order to approach what I'm doing right now. And in this way, it can kinda passively meet you where you want to be, as opposed to explicitly telling it to do the same thing every single time.
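A minimal sketch of the "memories" idea: a fact the user states once is persisted and automatically prepended to later requests. The keyword heuristic here is a hypothetical stand-in for whatever model-driven detection Windsurf actually uses:

```python
# Hypothetical sketch of automatic memories: facts stated once ("I'm using
# React 19") are stored and prepended to every later request, so the user
# never has to repeat them or maintain a rules file by hand.

class MemoryStore:
    def __init__(self):
        self.facts = []

    def maybe_remember(self, message):
        # A real system would use a model to decide what is memorable;
        # a trivial keyword heuristic stands in for that decision here.
        if "i'm using" in message.lower() or "we use" in message.lower():
            self.facts.append(message)

    def build_prompt(self, request):
        preamble = "\n".join(f"[memory] {f}" for f in self.facts)
        return f"{preamble}\n{request}" if preamble else request

mem = MemoryStore()
mem.maybe_remember("I'm using React 19")
mem.maybe_remember("please add a button")  # not memorable, ignored
print(mem.build_prompt("convert this component"))
```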
Guest 1
I don't know if there's anything you wanna add on to that. No. I think that's huge. I think also the other piece is,
Guest 2
because the system is able to actually look at the code base, right? Not only at the code that exists, but at the packages, the dependencies. It actually just makes it more sophisticated in what it believes is the state of the code base. Right? I don't think, after doing that analysis, it'll think you want,
Guest 1
like, an older version of Svelte, right, internally. But we also think there are more robust ways to get this rather than just having the AI find it. Like, I think what Kevin said with the idea of memories is a huge sort of approach that we could take. Yeah. Maybe one other interesting thing that you could talk about here is, like, the notion of a trajectory, kind of understanding what someone has been doing. Yeah. I'll give you an example. If you were to do an upgrade, right, you would start making changes. You might modify your package.json to update the version number, then you'll start actually ripping out certain functions and replacing the functionality.
Guest 1
And what's really powerful about the fact that we now own the editor, and this is one of the reasons why we wanted to own the editor, is that the agent behaves very, very well when it knows your intent, right, as Varun mentioned before.
Guest 1
And the way we understand and build up that intent is by looking at your recent actions.
Guest 1
So if you are making a set of changes and there's a clear pattern in those changes Mhmm. Everything you do in Windsurf will now recognize that you're on this trajectory. We understand that you're doing this type of work, like the upgrade, for example. And so your next chat message, your next autocomplete,
Guest 2
everything else will then take that into account going forward. So it's like this idea of learning over time. And yeah, one of the cool properties of that is, if you actually make a change in Windsurf to, like, 1 file, and that's actually the first of maybe 20 file changes, you can just tell Windsurf Cascade to continue, and it'll just go out and make the changes on the remainder. Because we actually know, at a granular level, the changes that the user is making. And that's why, like, we think, you know, this notion of an agent is a term that gets thrown around a lot. But I think the dream here is actually that the human and the AI are operating on the same version of where they think the software is going. Right? It's not just, like, something that comes back to you with a bunch of code and vomits it out. It's something that you had control over, and we've tried to build this throughout the product. Switching a little bit, I just wanna talk about the design of Windsurf. It's obviously a VS Code fork, but there's
Wes Bos
there's just a little bit added to it. You know? Like, the command palette pops up in the middle. The tabs are a little bit different on this, or on the side. Is that something you plan on doing a little bit more of? Because it's kind of nice when the editor looks a little fresh.
Guest 1
I'm glad you noticed these things.
Guest 1
The team will be very happy to know that you noticed some of the things. I mean, we spent a decent amount of time thinking about, like, typography, for example. Yeah. There's subtle changes in, like, the tabs that you noticed in the terminal.
Guest 1
We did move the command palette to the center because we wanted to emphasize it.
Guest 1
We're treating it as almost like an everything-bar type of pattern. So we made some, like, opinionated UX and UI decisions, some of which just simply feel like something is new, something is fresh. And this is where, you know, maybe it doesn't offer any sort of performance gain or feature improvement, but it's part of the story that we're kind of creating something new. And when people want to they wanna feel like they're using something new. And so that was part of the calculus there. And then there are some things that just simply demand a new pattern. Right? If you're making a bunch of changes across a bunch of different files, you need some way to step through those files, so that's where the Cascade action bar comes into play. Accept, reject, all those sorts of things. Mhmm. And a lot of these are kind of they're inspired by other products, and, obviously, we try and stick to sort of the VS Code design because that's what people are used to. But we did give it a lot of thought to try and figure out, alright, what are the places that we wanna add a little bit of spice? Yeah. It's much appreciated. It's funny how
Wes Bos
you oh, I always just want a little bit new. Like, even changing my tabs to a custom font via, like, some CSS hacks is really nice. It just makes it feel fresh.
Guest 1
Fresh. Yeah. Yeah. Yeah.
Guest 2
For each of the new features, there's, like, a graveyard of, like, 5 that we killed inside the company. But I think that's, like, the nature of how things go. Like, you know, we launch everything internally to our developers and, like, we are also very opinionated.
Guest 2
But sometimes you get maybe an idea that a bunch of people may not have been completely on board with, but you just kinda go with it. Sometimes you need to be a little bold. And do you keep it, like, up to date with VS Code? Like, how does that work when you fork VS Code, but you still want their updates?
Guest 1
Yeah. It's a rather complex merge conflict.
Guest 1
Maybe nightmare is too strong of a word.
Guest 1
You can imagine we've made a decent number of changes to, actually, the core of VS Code. And so when, you know, VS Code releases version 1.9, we will be right there ready to merge that branch into ours and do this resolution so that we can figure out, alright, let's keep all the things that we've done, but also, you know, we should benefit from the great work that the VS Code team does to just create a good platform. Mhmm. So it's ours. I would say it's manpower.
Wes Bos
You know, do you use Windsurf for that?
Guest 1
Yeah. Yeah. Yeah. Windsurf is very good at synthesizing this information, to be honest.
Guest 2
You know, one of the interesting things, maybe a couple examples of this is, like, other companies that do things like this, and they're much harder, by the way. Significantly harder. Google has their own version of Linux. Right? And they're constantly, like, patching Linux. As new Linux versions come in, they're constantly making changes. I think the difference for us in our space, that's maybe a little true, is I don't know. A lot of the core functionality for VS Code I don't know if you feel this, but, like, what has really changed for you in the last year for pure VS Code? Don't include other extensions, but just pure VS Code. Yeah. It's all Copilot stuff. It's nothing in the UI, and Microsoft, in particular, has never excelled in the UI department. I mean, that was this big sacrifice coming from
Scott Tolinski
Atom, the editor, beforehand. It was like, alright, you're going to VS Code, and it doesn't look as nice, but it does more. You know? So,
Guest 2
yeah, they've never really been tweaking too much around the general UI, not even aesthetically, but feature-wise as well. Yeah. Which is why, like, the urgency for just everything that they have and a lot of stuff that they have is not ridiculously high. And we are, like, constantly on, you know, trying to see, like, hey, are there meaningful things that they're doing? But I will say, just in general, when it comes to, like, the Copilot UI and UX, I guess, like, we feel, like, a little bit opinionated on how that UI and UX should be. So we're not, like, the most excited to take, like, the Copilot UI and UX that they have. I think, like, some of it we would not do ourselves.
Guest 2
So I guess, like, that's the way we sort of treat it, but it is like an exercise inside the company, but not something that other companies don't do for other large, like, open source projects.
Wes Bos
Totally. Yeah. Do you guys have, like, a back channel of VS Code forks, like a Telegram group of people who have forked VS Code? Because I was thinking about it. I was like, there's probably dozens of companies that have forked VS Code.
Wes Bos
Maybe not dozens, but, like, there's several companies that are, like, CodeSandbox, running it online, and then there's several that are running it as desktop applications. And then there's several that are using it in their, like, product as, like, an editor, or maybe it's just Monaco. And I was like, man, there's gotta be some sort of back channel group of people who are like, hey, has anyone ever dealt with this?
Guest 2
We gotta make one. But, interestingly, the craziest one, going back to Google: Google even has their own fork of VS Code inside of Google.
Guest 2
Mhmm. Really? Not not IDX, but their own? My understanding is IDX is not being used internally. I mean, this is the classic thing of, like, how a lot of Google products, like, they just push out, but they're not using the same one that they're using internally.
Wes Bos
That's cool.
Wes Bos
Just like the future of AI, do you think that there's going to be any, like, major leaps and bounds? The last couple years have been nuts in terms of AI in general, not just coding, but the leaps and bounds we've seen. You know? Like, we went from, look at this thing, can't do a hand, can't write a word, to, oh my gosh, they're making commercials with this.
Wes Bos
You're obviously in the space. Do you see that petering out or or or increasing?
Guest 2
You know, I think the way we sort of think about it is, what people do is they look at, like, ideas and ask, are they tapering off? And I think one of the ideas that people thought was tapering off was this idea of pretraining.
Guest 2
The idea of pretraining is we're gonna make ever larger and larger models that have more and more compute, and that idea was petering off. And it makes sense that that idea, even if it doesn't peter off, can't grow by, like, a 100x every year. You cannot build, like, a 50-times-larger data center every year. That would be very hard at some point. Right? Just from a power consumption standpoint. But human ingenuity is, like, very deep. Right? And people want to solve this problem. And there are many axes to make these products better. Right? Like, using different types of tools. Right? Multistep ways of using the product. Even given the same model size, that's unlocking a lot of new value. You see people like OpenAI, and probably many others will follow suit, at this idea of, like, test-time compute. What happens if you let the model think to itself? Granted, I think there's some trade-offs to that too. I'm not that excited about a model thinking to itself for 10 minutes and giving me a response if it could have just run some commands and then given it to me in, like, 10 seconds. Right? Like, I would prefer that outcome. But I guess maybe my point is there are so many axes by which these products can get better that we will basically see, like, exponential improvements because of each axis getting better. Like, okay, you know, you'll see 1 axis of: I let the model run for longer, and it's smarter. Okay. People will get better at that. They will make it so that it doesn't need to run for as long, and it'll get better. It'll get better at using arbitrary tools. It'll get better at using the context length that it already has. There will be a much more efficient way of using a lot more context. Right now, for using context, most of the approaches use this approach called attention, right, which is a quadratic algorithm.
So what that means is, if you have a million tokens of context versus a billion, you are now plugging in a million times the amount of compute. So, you know, this answer probably doesn't sound clean, but the answer really is, it's gonna be stacking up wins in a bunch of different parts of the stack, all the way down to NVIDIA. Right? NVIDIA is now making their data center GPUs more efficient. And all of these approaches are gonna give us a different type of improvement, right, ultimately, that is gonna feel like it is an exponential improvement year over year. Singular approaches may hit a wall, but then we will come up with new approaches. So I'm actually very optimistic about this. I'm not of the opinion of, hey, this approach cannot scale anymore.
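The arithmetic behind that claim: standard attention compares every token with every other token, so compute grows with the square of the context length, and a 1,000x longer context costs 1,000,000x more:

```python
# The quadratic-attention arithmetic from above: attention compares every
# token against every other token, so cost grows as the square of the
# context length. Going from 1 million to 1 billion tokens is 1,000x more
# tokens but (1,000)**2 = 1,000,000x more compute.

def attention_cost(context_len):
    # Proportional cost of the all-pairs token comparison matrix.
    return context_len ** 2

ratio = attention_cost(1_000_000_000) / attention_cost(1_000_000)
print(f"{ratio:,.0f}x more compute")  # prints "1,000,000x more compute"
```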
Guest 2
I think that just indicates to us that we don't have enough ingenuity to think of other ways to improve the models and product. Yeah. And even on top of that, you know, it does still feel like super early days for
Scott Tolinski
how we interface with these tools in general. I think we're all still learning how to interface with them and finding new, inventive ways to do that. I mean, even with Copilot Workspace, that's an entirely different way of handling PRs than any tool before it. It kind of was like, alright, we're gonna build this right into GitHub. And I think those types of approaches are only gonna continue and increase as we get more and more comfortable using these things.
Guest 2
I totally agree with you. And in fact, I think the best example of this is ChatGPT.
Guest 2
I was actually talking to Sam at some point from OpenAI and asked him, like, hey, why did it take so long for you guys to come out with for the world to come out with ChatGPT, given the fact that these instruction-based models had been things that Google and OpenAI had been noodling around with for over a year or 2? And it's kind of just the right interface popped up, which is crazy to think about. Right? Mhmm. It's crazy to think about. There's also just, like, this massive latent potential for these products that is just much more than what the raw model capability is. But that's just software in general. Right? In general, every piece of software is a wrapper around, you know, an AWS EC2 instance. Right? So it's, like, up to people to figure out ways to build better software around that. Is there any stuff, like, in your labs that you just
Wes Bos
like, is it part of the company that just tries wild things? Because, like, yeah, we just don't know what that next killer way of approaching this is.
Guest 2
I think one of the mottos we have, and Kevin can talk about his team and some product work, is, like, we should be failing probably over 50% of the things we do. Okay. That's not, like, some glorious statement for the company. I think it's, like, you know, 3 things. I think we hire really great people at the company, but we are not omniscient. Right? I don't think smart people have a monopoly on being right always. In fact, we are probably wrong a lot of the time. And I think if we are not wrong a lot, that probably means we're not trying enough ambitious things. We're doing things once they're too obvious. Yeah. Right? Mhmm. Or we're unwilling to change our mind on the ideas that are clearly bad, right, which is an even worse thing. That spells imminent death to a startup. So I think we should be willing to do a lot of things, and for a lot of things to fail. Like, Cascade was built on top of research that we were doing inside the company for many, many quarters, but we just couldn't crack it to make it feel like a good enough product until very recently. Yeah. I think one of the big strengths of ours that Varun touched on earlier is that our internal team is incredibly skeptical. We are all technology optimists in the sense that we, you know, we believe that technology is gonna get better. We believe AI is gonna get better, but we're incredibly skeptical about how these things actually fit into our lives. Right? It's the, oh, this is something cool, but,
Guest 1
really, is it actually gonna help me every day? And it's not until it really hits that threshold of, okay, I'm using this, like, multiple times a day, that we say, okay, we're gonna put this into production. We're gonna start treating this with seriousness. Because, like, you know, like we were talking about, there's so many cool things and so many cool directions that you could take something, but you have to be very, very realistic about just how helpful it actually is versus kind of trendy or flashy.
Scott Tolinski
Yeah. I could imagine you could you could find something to be cool.
Scott Tolinski
You could fall in love with it and then just not realize, oh, wait. Never mind. This isn't actually
Wes Bos
gonna get picked up. Yeah. Totally. Totally. Can we talk about pricing? Because I know Codeium has had a free forever plan. I know there's a free version of Windsurf. And then I know you launched at $10, assuming nobody in the company would ever hit the limit. And then, like, within a week, there were people, like, blowing through it. Like, it's amazing how the power users go through it. So now it's $15 a month. How does that break down? What does that get you? I could tell you if you want. Sure. Yeah. Okay. So, basically, I think there are many types of users that use the product. Right? Some people actually are using the product who are extremely nontechnical.
Guest 2
Right? For them, this is the only way that they interface with building applications. They don't even touch the IDE to write code, which is foreign to all of us in our company. Right? That's foreign to us. Right? Because they're building these apps probably very quickly, you know, sort of using them for what they want and maybe moving on to the next thing. Right? Yep. And when we originally built the product, we had some number of credits we were willing to give. We soon realized our average user uses significantly more than that number of credits. So we had to increase the price, and we also doubled the number of credits. So Okay. Compared to what we had before. Right? So it is not a cheap product for us to run, by any stretch of the imagination.
Guest 2
But despite that, people are also blowing through the amount of usage right now for Mhmm. for the $15 product as well. In fact, like, our usage is so high that, you know, we've hit rate limits at the highest of levels for, like, providers everywhere right now. I think, like, for us, we are not the type of company that really wants to take a lot of price. I know that sounds, like, sketchy for a startup to say, but the real reason is because we have, like, a real enterprise business. Like, you know, companies like JPMorgan Chase and Dell have thousands, close to 10,000-plus developers, using the product right now internally. And, yeah, we even won, like, you know, JPMorgan Chase's Hall of Innovation award. And that's where we see the real profitable revenue for the business coming from, right, not from individual developers that really want to cut their teeth on AI tools. So there's, like, this fine line of, like, we cannot go bankrupt giving our product away, but also, at the same time, like, we want to set, like, the lowest price possible so that we can get the most people using the product. I'd also say we've kind of positioned ourselves in such a way where, you know, the cost of intelligence is getting cheaper and cheaper. And I think unanimously, like, if you look across the board at all the different model providers, every model is just getting better.
Guest 1
Mhmm. Every product is inheriting from that. And in the same way, like, we are positioning ourselves to kinda benefit from the industry trends. So the quality will continue to go up. The cost will continue to go down. The speed will continue to get faster. And, hopefully, you know, Wind Surf will inherit from these things, and that's those 3 axes are really what we're working around the clock to try and figure out. That's awesome.
Wes Bos
One more thing I wanna ask about was this whole idea of, like, running it in the browser. Because it seemed like for a while that and even, like, companies like Shopify, all their devs don't download an editor. You know? It's all cloud based. You know? You just go to a website. You log in. That's your whole editing experience for everything, and that cuts down all of their DevOps overhead. I'm sure it's a big security thing. Right? You can pull the plug, and immediately, if someone's fired, they don't have access to any of the code. It sort of changed now that we've got all these AI things. Will that ever be a thing with Windsurf and these editors, where it will just run in the browser or run on, like, a thin client somewhere?
Guest 2
We actually have no problem with that in the long term, to basically provide a way to run in the browser. There's no real sort of detriment. One of the weird things that we sort of felt was, hey, if we need to make people download an app, maybe there would be some reluctance. I guess we were wrong. A lot of people are willing to download an app if they think the juice is worth the squeeze. And I guess one of the cool parts about the design of these kinds of editors is they can connect to remote containers.
Guest 2
So, actually, a lot of companies, what they ultimately do is they use the dev container model, right, with remote SSH. The idea is you are connecting to another machine, and if you do cut off access to that machine, you do lose access to the code. So, look, if our users ultimately tell us, hey, we really, really want this product to run in the browser, we're happy to do that, but it comes at the cost of some iteration velocity for us in the short term. But products like, you know, the OSS version, or Code OSS, which is the base of VS Code, can run in the browser, but that's largely because they are built on top of, you know, Electron and all these other primitives that make it easy to do so.
Guest 1
Awesome. It kind of comes at a cost. You gotta be a little bit focused.
Guest 1
There's gonna be a lot of people asking for all sorts of things. Right? We've all been kind of the recipient of users asking for features. Oh, yeah. Yeah. And Windsurf came out, like, 2 months ago. And so we've got to be really intentional. You know, our team is growing.
Guest 1
It's it's a cracked team, but, you know, we only have so many people. And so we're trying to we always have to be kind of ruthlessly prioritizing what to work on next.
Guest 1
And, you know, at some point, maybe web is the place we want to extend to, but right now, we're pretty locked in on just the Windsurf
Wes Bos
things. And I appreciate that as well. Because you gotta think, like, rolling anything out in, like, VS Code standard, probably they have to think about every single possible person, every single type of developer. You know? Like, are Python developers gonna be okay with this feature because it does x, y, and z? Or what about when somebody doesn't have a sidebar available? Or what about when someone's popped out? There's so many different edge cases for that type of thing. And as soon as you start to say, okay, well, we'll also run it in the browser, and we'll also do this, then it gets slower to see new features and new improvements. And that's when our product starts to go,
Scott Tolinski
Yeah. Yeah.
Scott Tolinski
Usually, in this show at the end, we do sick picks and shameless plugs. Kevin, you've given a sick pick and a shameless plug before.
Guest 1
Would you both like to give a sick pick and shameless plug? I could go first because I've done this exercise before, and so I was on the train into work today thinking about it. I definitely have a sick pick recently. So as you 2 might know, I take a lot of photos. And so I recently treated myself to a new sick pick. It's called the Ricoh GR IIIx. It's a point and shoot, shoots raw, has film simulations and the whole thing, and I've recently been taking that all around where I go to kinda get the daily-life, street-view, film simulation type of look.
Guest 1
Yeah. And that's been a lot of fun for me. So that would be my sick pick. Something along that line. You know, it doesn't have to be work related. Varun only works, so, you know, it's hard to Yeah. Right.
Guest 2
Yeah. Maybe this would have been a nice one if I had cycled as much, but I really like if I could be one of the cool kids that could get the automatic sort of gear-changing mechanism, the Di2, on my cycle. That's, like, a big one. Unfortunately, like, I don't have one. So but that's, like, the top of the pack that I That's a sick wish.
Scott Tolinski
That's sick wish. Wish.
Wes Bos
Yeah. I've seen those often, where there's no cable from, like, the gear selector to where the derailleur is. And I often think, like, man, that's wild. And I've seen guys change batteries out on them while they're, like, on the Tour de France.
Guest 2
Just it's crazy as well. It's crazy. You don't need to change the there's only 1 selector. So it automatically cycles between the big ring and the little ring for you. It knows which one to sort of go on, and it's a smooth experience. Perfect. I guess the hard part about all these things is the charging. It just feels like I would just be worried about the charging the whole time. Like, when do I charge this? And bicycles right now, like, you don't need to worry about any of this stuff. I guess you do need to do routine maintenance and stuff like that. So Sometimes. Yeah. So Yeah. I guess that's the battery introduction always. I feel like there's that 1 person who at the start of the ride is like, shoot, I didn't charge mine. Yeah. Yeah. And it's like, well,
Wes Bos
you're actually just screwed. You're just screwed at that point. Yeah. It doesn't just, like, fall back. I'm sure you could push it over, but, oh, that's crazy. I gotta I wanna try this. I wanna hack it. What is it? Is it Bluetooth? I'm curious what the It's Bluetooth. Technology is. Okay. Because I've been dipping into the Web Bluetooth API
Guest 1
recently, which Because you could just Windsurf it.
Guest 2
That's my favorite one.
Wes Bos
Oh, yeah. I should Windsurf it. Yeah. I wonder if it could. Because Chrome has a really good, like, about Bluetooth page or something like that, and it will list all the devices that are within range.
Wes Bos
It's really cool because it will show you if there's AirTags and stuff around you, and you can see it all. That's yeah. If you ever wanna know, like, what Bluetooth devices are near me, and not just, like, what the computer thinks you should see, like, there's thousands of devices that are constantly sending out Bluetooth
Guest 1
Mhmm. Radiation things.
Wes Bos
Yes. Exactly.
Guest 2
You can, like, locate these things too. Right? Because Bluetooth kind of helps you with location. Could you, like, just find where it is?
Wes Bos
Oh, man. Probably. That's fair. It's all about, like, signal strength. Right? Like, with the radios, there's RSSI, and there's another one. It'll tell you the strength of the radio signal. So assuming, yeah, you could move around, and as the signal gets higher or lower, you could beep it. So maybe I'll try that. That feels like a very scary security risk,
Guest 2
to just go to an airport, and there's just someone doing this.
Wes Bos
Yes. Yeah. I wonder. The thing about being able to access the stuff over Bluetooth is, some Bluetooth devices have, like, standardized things like get temperature and whatnot, and they'll publish what they are. But then a lot of them just don't tell you what the calls are, so you have to sort of guess or reverse engineer what the actual calls are. We have a new baby, and he has this, like, foot monitor for heart rate. I took that and hooked it up to the web and put it on my thumb, and I could get my heart rate through the little it's like a smart sock that tells you the baby's heart rate and temperature while he's sleeping. Yeah. And, that's cool. Hooked it up. So yeah.
Wes Bos
Anyways Wow.
Wes Bos
Shameless plug. Obviously, you'd probably like to shamelessly plug Wind Surf. Yeah. I think that's fair. It's where we spend most of our time, I would say.
Wes Bos
Cool. I bet.
Wes Bos
Yeah. So
Guest 1
you wanna do the official shameless plug statement?
Guest 2
Yeah. You guys should all use Wind Surf.
Wes Bos
Win dot surf. Yeah. Quite the pitch. Yeah. Wind.surf. That's actually a good domain. Did you have to buy that, or was it available?
Guest 1
You know, it's available, but everything has a price. Yeah. Yeah.
Scott Tolinski
Did you it wasn't a friend.com
Guest 2
price or whatever that was. Yeah. No. We wouldn't make the news. We have not paid the friend.com price. How much was that? Oh, it was an outrageous amount. Yeah. Oh, the friend.com one? Yeah. How much was it? No. How much was wind.surf? I wanna know that too. It was probably maybe 2 orders of magnitude cheaper. The friend.com price? No, it was 1.9 mil, no, it was
Scott Tolinski
$1,800,000.
Scott Tolinski
Okay.
Wes Bos
Yeah, it's a lot. Oh, man. One of my favorite things to do is ask people, what's the story behind the domain name? And almost nobody ever wants to tell us the story. They're all clammed up behind it, and they
Guest 2
I think it's because sometimes, for the previous owner of the domain, it was hard to reason about, and it's just a bad feeling overall at the very end, potentially. Yeah. Yeah. It's true. Or, often, I think people buy them under a mask.
Wes Bos
I had somebody negotiate with me back and forth for a long time on a domain name, and I finally sold it to them. A week later, boom, they put it up for sale for 10 times the price I sold it to them for. They knew it was worth more, but they had made up a whole fake story and everything behind it. I was actually real sad. I forget what it was. I don't know if they ended up selling it or not, but yeah, I hope they didn't. Yes.
Wes Bos
Right.
Wes Bos
Cool. Well, thank you so much, Kevin and Varun, for coming on. This was really fun, really interesting to learn all about it. I've been using Wind Surf, and I'm gonna continue to check it out with all the new features you guys are releasing.
Guest 1
Cool. Yeah. Thanks for having us, Scott and Wes. Hopefully, we'll see you in person at a conference soon. Oh, yeah. We'll be around. Definitely. Alright. Peace. Take care, guys. Thank you.