Did DeepSeek Just Break the AI Industry?
January 31, 2025
TLDR: This week on Waveform, the discussion revolves around Google and Samsung's new Android XR headset challenging Apple Vision Pro, updates on DeepSeek, the Pebble smartwatch's return, and a 'Something? Or Nothing?' segment. The episode concludes with trivia.

This week on the podcast, Marques and David dive into the explosive developments within the AI sector, particularly focusing on DeepSeek and its implications on the industry. They also touch upon new hardware advancements, including the Android XR headset, and provide a nostalgic update on the Pebble smartwatch.
Key Highlights from the Episode
1. DeepSeek’s Emerging Presence
- DeepSeek V3: This AI model was launched on Christmas Day and reportedly cost a mere $5.6 million to train, significantly less than competing models from the likes of OpenAI.
- Performance Claims: It has demonstrated performance levels comparable to leading models such as GPT-4 from OpenAI, raising concerns among industry giants about competition.
- Stock Market Reaction: Nvidia lost roughly $600 billion in market capitalization in the immediate aftermath of DeepSeek's claims, underscoring how seriously the industry took the development.
- Future Potential: Marques and David discuss how DeepSeek’s efficiency might change the landscape for AI models, making advanced technology more accessible to smaller companies.
2. Android XR Headset
- Samsung’s Collaboration: The hosts explore an exclusive look at the new Android XR headset, created in partnership with Google.
- Initial Impressions: The headset offers impressive resolution and a swappable battery, but the hosts note that its pass-through quality and overall immersion still fall a notch below Apple's Vision Pro.
- Use Cases: Early users can smoothly run Play Store apps, with minimal adjustments needed for optimization, making it promising for developers and users alike.
3. The Resurgence of Pebble Smartwatches
- Open-Sourcing Pebble: The hosts discuss how Eric Migicovsky, Pebble's founder, is rallying for the revival and open-sourcing of Pebble after years of the brand sitting dormant under Google.
- Community Driven: This new venture aims to cater to users seeking simpler technology and tailored functionalities without the overwhelming features of modern smartwatches.
- Consumer Interest: Marques imagines a new Pebble device that embraces simplicity, strong battery life, and core smartwatch functionalities, filling a niche in today’s tech landscape.
Expert Opinions and Insights
- Market Disruption: The emergence of DeepSeek and the return of Pebble could significantly alter consumer engagement with AI technology and wearables, reminiscent of early Kickstarter projects.
- Long-Term Prospects: The discussion includes the long-term viability of these technologies, how they can shape user experiences, and perhaps lead to a resurgence in other innovative technologies that prioritize efficiency and accessibility.
Practical Applications and Takeaways
- DeepSeek’s Efficiency: Businesses should remain alert to DeepSeek’s capabilities as they could represent a shift towards more economical AI applications, potentially democratizing access to advanced AI tools.
- Investing in Wearables: The return of Pebble signifies a growing market for simpler wearables, encouraging new companies to rethink product design focusing on minimalism and functionality.
- Strategic Movements by Tech Giants: Awareness of how larger companies respond, such as adjustments to their pricing or innovation strategies, will be critical for tech consumers and investors alike going forward.
Conclusion
The evolving landscape of technology, marked by DeepSeek's innovations and the nostalgic Pebble smartwatch comeback, highlights a convergence of accessibility, functionality, and innovative design. As listeners, we should consider how these shifts might affect our choices in AI utilization and wearable technology moving forward.
What's up, y'all? It's Kenny Beecham. On this week's episode of Small Ball, we have one of my all-time favorite segments, the Kenny for Real All-Stars. You know which players have been selected to the 2025 All-Star Game, but this list is different. The Kenny for Real All-Stars are my favorite role players throughout the league, unappreciated players that are genuine stars in their role. In this episode, I'll also share my honest thoughts on the Kings and what they should do about De'Aaron Fox. You can watch Small Ball on YouTube or listen wherever you get your podcasts.
So the only things I've seen is, number one, it was an AI model that costs, like, a tenth of what OpenAI's models cost, and it was just good. Allegedly. Allegedly. And then, I heard that it was just trained on OpenAI's stuff, and it's not actually that impressive. Oh no, they stole our stolen content!
What is up, people of the internet? Welcome back to another episode of the Waveform podcast. We're your hosts. I'm Marques. And I'm David. And that means that Andrew's not here. Yeah. Andrew's out this week. Ellis is also out. Adam's holding down the board. Is he sleeping in the corner, like usual? It's going to be a good time. It's been a couple of weeks, so we do have a lot to talk about. We got an exclusive look at Samsung's Project Moohan headset. A lot of interesting thoughts there.
We're going to touch on the new DeepSeek R1 model. That's kind of all the rage in the AI industry right now, but also talk about Pebble, sort of a... Can I say nostalgia about Pebble? I think so. At this point, I feel like I can use that word. We'll talk Pebble for a bit, and also we're going to play another game of Something or Nothing at the end. Love it.
But first of all, I'm gonna give a shout-out to a piece of content. So Nothing, that's kind of a great, very headline-y name, Nothing. Nothing, the company, did a video on their YouTube channel. That was almost too smooth. So they basically emailed me a couple months ago and were like, hey, we're gonna make a video on what it would take to make your dream phone. You know how you've had a dream phone in the past where you just Frankenstein together a bunch of parts? Just give us one of those and we'll make a video of it. And I was like,
All right, so I just threw together, you know, some specs, like it'd be cool if you can combine these, and sent it off. And then two months later, they were like, hey, we made the video. So I watched the video and it was pretty interesting. Basically, it's them breaking down roughly what the costs are of a smartphone. Most interestingly, the bill of materials itself and the different components of a phone that go into actually building it.
So I'm sure you've heard before that, like, yes, you buy a new phone and it costs $800 or something, but that phone only costs $243 in materials. And then everyone goes, well, they could have charged less, or they could have, you know, here's all these other thoughts on the difference between bill of materials and final retail price, but...
I gave them some specs and this is what they came up with. So go watch the video if you haven't already, but I'll just give you some top line level spoiler alert stuff that I think is interesting and maybe you have a reaction to this stuff. So I told them a huge battery. We rounded up to 6,000 milliamp hours, 65 watt fast charging, 15 watt wireless charging. They said $13 per battery. Your dream is 15 watt wireless charging? Well, I didn't specify actually. I just gave them, I said fast charging. Oh, okay. My dream would be,
Give me 100 watts of wired, 100 watts of wireless, 100 watts of both. But, $13 per battery. Okay, which doesn't seem that terrible actually. I wanted a 6.1-inch, 120-hertz LTPO AMOLED, 1440p, 6.1 inch being on the small side, but everything else basically being like a high-end
S24 Ultra-level display. $35 per display. Okay. That's more than I expected. Yeah. I mean, this, I guess, includes the glass and the panel itself. Yeah. That's fair. 35 bucks. Triple cameras. I just handed them the S24 Ultra's, roughly, but also not really. I would like a further telephoto than the S24 Ultra has, but the idea is triple cameras on the back and a Pixel 9
Front facing camera and $80 total for all the cameras. Okay. $80 for the cameras. That seems, we got a bunch of sensors. Yeah, a bunch of sensors and then all the glass and optics and whatever has OIS in there. It seems crazy that it's that much higher than the display. I always figured the display would be.
just as expensive as the cameras. Yeah, I imagine the actual plastic lenses are not that expensive, but you never know. The sensors, Sony has a monopoly on, so what are you gonna do? Fair, yeah. They're the Samsung... well, no, those are Samsung sensors. Samsung makes some of their own sensors. I forgot, they're Samsung ISOCELL sensors. Yeah, so I mean, I'm not tied to those particular sensors, but yeah, roughly 80 bucks. The storage, give me a terabyte of fast storage and 16 gigs of RAM, $90.
Oh, for that. What? Yeah. Really? Yeah. Oh, storage and the RAM. Storage and RAM. OK. $90. It still seems like a lot. It does. And whenever you go to upgrade, I guess it makes sense whenever you go to upgrade a phone and you try to double the storage, then you see real price increments. It's not like you have to pay $200.
in bill of materials to increase the storage. But it is a pretty significant cost in the phone. Yeah. So that's interesting to see. One terabyte on an SSD is much cheaper than that. It is true. But, you know, it is fast. Yeah. It is a tiny little UFS 4.0 SSD. We'll go with it.
And then a Snapdragon 8 Elite, so again, they said they couldn't reveal their exact supplier costs, totally fair, Qualcomm doesn't let them, but they basically referenced that a typical high-end flagship chip will cost about $190. Wow. For them, at their volume levels. Dang.
So, pretty pricey. I actually have some more thoughts in my S25 Ultra review on the cost of the Snapdragon 8 Elite. We'll get there. Okay. But yeah, 190 bucks for the chip. And then some other small things: the motherboard, antennas, electronics, haptics, other small electronics, miscellaneous, $15. Okay. Packaging, cable, stickers, box, $30. Wow, that's a lot more than I thought.
Design materials, so if there's fancy stuff you want to do with the back glass or any decoration or maybe some special paint or something like that, $8. That would be why we see so many insane weird designs on the backs of phones. Not that expensive. You can do Dragon Ball Z for $8. Exactly. Structural parts, buttons, mid-frame, screws, things like that, vapor chamber, even $10. Okay.
patents, licensing, software, $29 per unit. And then rough total bill of materials, again, this isn't all in cost to make the phone, but the bill of actual physical materials, $500. So that's for my like crazy, super high-end dream phone idea, $500 to build. But then of course you have to add on R&D, all the teams, the employee costs and licensing and things that actually
go into the one-time engineering cost to make this motherboard work with this RAM, work with this storage, all that stuff.
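As a quick sanity check on those numbers, here's a small sketch that just adds up the rounded per-component figures quoted above and then spreads a hypothetical development budget over different sales volumes; the component prices are the ones from the video, the unit volumes are made up purely for illustration:

```python
# Rounded per-unit component costs quoted in the Nothing video (USD).
bom = {
    "battery (6,000 mAh, fast charging)": 13,
    "display (6.1-inch 1440p 120 Hz LTPO AMOLED)": 35,
    "cameras (triple rear + selfie)": 80,
    "storage + RAM (1 TB + 16 GB)": 90,
    "flagship SoC (Snapdragon 8 Elite class)": 190,
    "motherboard, antennas, haptics, misc": 15,
    "packaging, cable, stickers, box": 30,
    "design materials (back glass, decoration)": 8,
    "structural parts, buttons, vapor chamber": 10,
    "patents, licensing, software": 29,
}

bom_total = sum(bom.values())
print(f"Bill of materials: ${bom_total}")  # -> $500, matching the video's total

# Illustrative only: amortizing a roughly $20M development budget over unit
# sales shows why "it only costs $500 in parts" is an incomplete picture.
dev_budget = 20_000_000
for units in (100_000, 500_000, 2_000_000):
    per_phone = bom_total + dev_budget / units
    print(f"{units:>9,} units sold -> ~${per_phone:,.0f} per phone before margin")
```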
And then it breaks it down in detail in the video, but it might cost $20 to $25 million to develop a phone. That was a very funny part of the video, because they're like, all in all, it should be about $500, but R&D is about $20 million. Okay, so we got to sell a lot of these. You do have to sell a lot of phones to sort of break even, obviously. My favorite quote was they wrote, as one of the costs, purchasing products from competitors to understand the market better.
And they didn't give an exact cost for that, but I thought that was interesting. It happens in a lot of industries and I just kind of forgot that that's a
thing you have to budget in. Like, car companies just buy tons of other companies' cars. And then drive them around and go, huh, I kinda like that. Definitely more expensive for car companies than smartphone manufacturers. Well, in total cost, but maybe per unit in development, relative to the final cost of the product. Maybe it's reasonable: you buy five or six cars to make a car, you buy five or six phones to make a phone. So yeah, anyway, go check out that video. It's pretty interesting and insightful. I don't really see that sort of behind-the-curtain
number stuff very often from smartphone manufacturers. So check, check that link in the description. Okay. But anyway, okay. So I think the big interesting thing we got to try this week, with the MKBHD video we put out, is Project Moohan, Samsung's headset collaboration with Google and Android XR. I thought it was very interesting. Yeah, I have so many questions about this. I have all the answers. Okay. Yeah. So I tried to put as much as I officially know in the video. We don't know things like the exact release date or the exact price. We don't know
A couple final details on things like field of view and resolution. I can just tell you like what I thought looking into the headset, but most of what this experience was was me getting to try it and use Android XR and Gemini.
on a premium-ish Samsung built headset. And yeah, it looks a lot like a Vision Pro. They clearly were inspired by a lot of parts of the Vision Pro, but they did improve on some stuff, like the battery, even though it is a battery tied to the side and in your back pocket. It's a removable cable. It's USB Type-C. You can pop on another battery, which is pretty cool. Can you plug it into a wall socket? You can. Okay, cool.
Well, the, the battery that comes with it had another USB port on it that you can plug into the wall. So can you put the cable directly into a... oh, um, that's a good question that I didn't get an answer to. I would imagine you can. Okay. But that didn't get an official answer, because it'd be cool if you could plug it into like a laptop as well. And then you have lower latency of
Some sort of screen expansion. True. That's a great point. Yeah. The cable's long enough to plug it into a laptop for sure. I don't know about a wall, but it's a cool idea. Okay. Yeah. I mean, they made this headset a while ago. They've been finalizing it. It was near final. So what we saw and what I put in the video is probably basically what you're going to get when you eventually get to buy one of these things and they'll name it and it'll come out and it'll be a
some Unpacked event later this year, probably. But, I was just going to ask, is this the name they're going with? It's probably not going to be called that. Yeah. So Moohan... Android XR? Oh, Android XR is the platform name. Yes, Google's. That is the name of the software officially. Yes. Moohan is a code name. It means infinity, and it's a reference to this infinite canvas that you get to play with, which is kind of cool, but it's not the name of the product. So is reality really an infinite canvas?
Uh, technically. Yeah. The universe is expanding, Marques. It's an, it's an expanding canvas. So, questions. Yes. Okay. So something the Vision Pro does really well is that when you're looking through it, the frame rate is high enough that it doesn't really feel like you're looking through a screen.
It kind of just feels like you're walking around and the quality is a little bit worse than real life. How does that feel through this headset? Couple notes. One, the resolution of the display was pretty good. I would say a notch below Vision Pro, but it's pretty crisp.
The overall, like, pass-through from the cameras was good enough. It wasn't stunning or amazing. Vision Pro is definitely better. Like, when you look through Vision Pro in pass-through, you feel very close to, like, just looking through some blurry glass or something. This was a notch below that for sure. But they did something interesting, which is they let you take off the bottom of the light seal. And that gives you this peripheral vision where light is coming in on the sides and the bottom.
And for some reason that made it feel better because your brain sort of fills in the bezels around the display with the rest. So like if I'm looking at a wall and I see through the middle, the pass through, and then my peripheral vision sees the outside of the wall, my brain kind of fills in the gaps, kind of merges them in a way, which is interesting. So I prefer using it without the light seal because of that, but generally decent.
B plus pass through, I would say. Okay. Yeah. I have a question. Yeah. So we actually just dropped today an Apple Vision Pro bonus episode. Go watch that if you haven't. Watch it. But one thing that the developers were telling us is that Apple did a good job at making Swift easy to code in for iOS apps. And when they released the new headset that you could just like kind of like know what you're doing over there. Yeah. Did they mention anything about that with the Android version? Are they kind of native coding? Yeah.
Yeah. So the details that I got from the team are number one, all Play Store apps are compatible. Off the bat, no extra code, they will work. Just like as a little window pop up. Yeah. So I went to the Play Store, I downloaded a regular phone app and you could just pull it up and use it like a normal phone app, change the aspect ratio, scroll around, it works.
Um, but there obviously would be a benefit to having optimized apps, what they call spatial apps, and a bunch of the built-in apps are optimized, and apparently it's a few lines of code to update your app to be optimized for spatial. And then you can go crazy customizing and adding way more stuff, but like the YouTube app, for example, was a spatial app. Uh, it chose to curve the window a little bit. It added these floating side panels for related videos and comments. And then you could add a background
for an environment that you're in, and then watch YouTube videos in this floating background on a mountain.
not as high fidelity as Apple's, just not as good looking, but cool feature. So my understanding is that it will be pretty easy to make spatial apps, but if you don't update your app at all, it will still work on the headset. Dude, this is the reason that they haven't made a YouTube app for Vision Pro, because they want it to be exclusive to an Android XR headset. I wouldn't be shocked. I mean, that's a pretty huge thing, especially since they shut down the third-party Juno player.
My question is, what's a better experience: watching YouTube in the browser on Vision Pro, or a built-in YouTube app on this Moohan headset? Right, because in Vision Pro Safari, they have a special window mode for a video player now. So you can watch a video full screen with a background
kind of exactly the same way you do on the Moohan headset. It just won't be curved. But you can change the size, make it huge, make it further away or closer to you. It's similar. Generally, I feel like a player, like an app, is going to have better UX design as well than just the website in a window. The second you have to move around the actual browser, it's better on the Samsung headset.
better on the Samsung headset, though. Yeah, because I don't want to have to move around YouTube.com. Right. Oh. With my pinching and scrolling. I hear what you're saying. Yeah. Cool. That's an interesting thought. Okay. But I enjoyed that. Okay. Yeah. Let's see. Other questions. Do they have any sort of group stuff you could do? No, none of that was demoed. No. Okay. I would assume no, honestly. Yeah. I think it took Apple a while even to start building these like shared experiences and SharePlay as like their magic
like, sauce that, like, works kind of cool with certain apps. I had no shared space experiences with, like, another person in a headset interacting with the same model as me. So no personas. No personas. Crazy person. That's a good question. I didn't make any video calls, so I don't know what that would look like to the person on the other side. Maybe they just haven't. They just see your eyes really up close. Yeah. Yeah, there is, I mean, I was in there playing with the headset for maybe two or three hours. I didn't get to do everything. Video calls was not something I got to.
But I did play around with some stuff that's not in the video, like just web browsing and poking around the stock apps and things like that, throwing apps everywhere. Honestly, my biggest takeaway was, damn, this might be the most useful thing I've done with Gemini. Just because you can interact with it with your voice. It can actually
poke into the UI of what you're doing. And it seems like it can actually move into spatial apps and, like, use the UI for you inside of spatial apps. But I could ask Gemini to clean up my windows. And if I had six windows, it would line them all up next to each other. That was great. I could ask it to open an app, open Google Maps and show me this thing, and it would do that for me. If I was looking at something in pass-through, like a painting on a wall, I could say,
who painted this and it would just pull up a web browser and Google something like who painted this. So it was pretty like hands free. Obviously it still has eye tracking. It will still have controller support, but the complete lack of having to do anything with my hands.
and just talk to the thing was surprisingly nice. That was something I noticed for your video. So you said that you basically just have to put it on, do the eye alignment thing, which takes like a second, and it automatically does hand tracking. Yes, hand tracking is automatic. So you don't have to do the whole vision pro thing of like, look that way, look that way, look up, look down, put your hand on. Or eye tracking, you do have to do the looking at the dots to dial in eye tracking. So by default eye tracking was actually off and I was doing everything with my hands. And so I'd point
my pinched fingers at the subject and select things, which is a lot of arm movement. Obviously, in Vision Pro, you straightaway set up eye tracking, and you never really do that. You can do essentially the same exact setup. It's looking at a bunch of different dots around the screen, and then it's all right, all right, we could see your eyes now, and then you can do the same sort of stuff. Does the eye tracking feel as precise as on Vision Pro? Yes, yeah, yeah. It's the same magical look at something. It's highlighted instantly.
move around the UI with your eyes, and you can see things like highlighting as you look at them. Yeah, it's pretty good. Okay. So when Vision Pro came out, it definitely felt like a very magical type of technology. Do you think that this feels like it's exactly caught up? If the Vision Pro is an iPad Pro, the Project Moohan headset is a Samsung Galaxy Tab.
So if you love the magic of, this is a super thin piece of hardware that is actually a computer and it's amazing that it can do all these things on a display, you will feel the same thing with the Galaxy Tab. There are certain things that Apple does in their own ecosystem that are another level of magic on top of that, whether it's SharePlay or shared experiences or whatever apps that you're using
that will not quite be the same on the Google one, but Google has Gemini, Google has YouTube, Google Apps built in. It has that extra functionality to make up for.
maybe not being as magical. So it is very functional. There's no way it costs $3,500. I think this thing is gonna be 2K tops, probably less. And I think that's gonna be a big reason people will consider it over a Vision Pro. But yeah, not as magical. I think that's fair. Interesting. I guess the relevant comparison could be like chat GPT to Gemini where because Google has all of these Google apps that Gemini can tap into and it already has all your data, it's more useful than just like a platform.
in certain ways. Yeah. It's, and it's cool. And I still want to play around with it again and try things like, like searching inside of apps. Like I wonder if I could ask Gemini in the headset about an old email and I wonder if it would go into the email app and like search through that and find it. That would be amazing. Cause essentially it's just pulling up an instance of Gemini live, which is the conversational thing that's on your phone already. So I imagine whatever that can do.
this can do in the headset. And since it's multimodal, I can take a picture on my phone and it can search that. Well, in the headset, it's running pass-through all the time. Gemini Live can see that. And so I can just ask about something I see. So there's a lot of potential, I think. And when you were talking to Google, did they have a specific positioning for why this is useful? Because Apple never really defined what it thought the Vision Pro was best suited for. It's true. And I feel like it tried to figure that out after it launched it.
This is great for media consumption, but also kind of work. And as they have updated Vision Pro, they've updated the categories that they found. People were using it for more, but it still isn't that defined. And I feel like that's part of the reason it is not doing well. Did Google have like a, we want people to use this for work. We want people to do this. Like, do they have a positioning for this thing? It was mostly centered around Gemini. I think if you ask them, they would mostly tell you like, this is the best way to use Gemini.
You could use it on your phone, you could use it on your tablet, or you could just use it on a pair of glasses or a headset that you're wearing all the time, and it's always able to talk to you. So I think their pitch would be, it's the best way to use Gemini. And that's, it reminds me of your AI Pin video, like the part where you ask the AI Pin to name the car, and then you just pull out your S24 Ultra, and it does it faster. I think there's some instances where this is the best way to use Gemini.
In a world where you're walking outside with this headset on and a car passes in front of you, you literally look through the headset and go, what car? You just look at the car and go, what kind of car is that? And Gemini can see the car you're looking at and can tell you right away. I had a clip in the video where I just held up a book, and the book had a picture, and at the bottom it said, like, the name of the art, the name of the photo, and where it was taken.
And all I did was pull up the picture, look at it and go, can you take me there? And it, it understood that context, understood that it was a location, opened Google Maps, searched that location and then took me to that location in Google Maps. So if it's able to do that sort of parsing and understanding and carry out actions into apps on my behalf, I think that's the pitch. That would be cool.
I would like to see what the limits of that are outside of a contained space. Totally. Also, do they have any sort of like workspace expansion stuff where they can expand your laptop screen or anything like that? Did not hear about that. Okay. Didn't hear about that. Interesting. I would be curious what happens if there's any screen mirroring or if we're plugging into a laptop, what happens? Because I didn't get to do that either. I want to see their first ad for this device because I need to know what they're trying to tell people to buy it for.
That's a great point. The ads say a lot about what they imagine people use it for. I did get to connect a Bluetooth mouse and keyboard. And there wasn't a computer also, so it was just me using the mouse and keyboard to move around Android XR apps.
which is cool. But yeah, quickly going from pinching something and then typing on a keyboard and then a mouse, all that was very fluid. Interesting. But yeah, I'm curious about that first ad too. Interesting. Yeah. I got two questions. One, were you able to look at a bright thing and then back at a not so bright thing? Yeah, good question. So like you said, this was a very controlled environment and
In the interest of getting as many shots for a video as possible, we basically stayed in this evenly well-lit room. There was a display in the room. I could look at it and I could look around very quickly and that seemed very fluid, but I didn't really.
I don't have any memory or footage of like looking at a super bright light and then something dark. And so I don't know if it would handle it the same way as Vision Pro. I kind of imagine it will though. Yeah. Just kind of trying to balance out exposure quickly and hopefully not peak. Yeah. Okay. Next question. Do you think that this is like from what you experienced and going back to what their first ad is going to be? Is this basically just a dev kit? Again, just like how the Apple Vision Pro was?
Yeah, it's basically a dev kit. I think that's accurate. Honestly, right now they are using this headset as a dev kit. And that's where you'll see it right now before it's shipped. But yeah, it feels like, okay, Vision Pro is out, Quest 3 is out. There are headsets that are out in the world. And if we want any sort of advantage or if we want people to use our platform, we need people to develop for it. We need to get something out in their hands.
This is the first headset that has Android XR. Take it away, developers, make awesome stuff. And then hopefully that leads them to making magical, amazing apps that make you want to use it. The Vision Pro problem still applies when it's Android. So I just really need to know what the price is, but the problem applies at whatever the price is.
Probably, hopefully less. Maybe less. Because even at $1,750, this thing is way over. Because for a cheaper AR-VR experience, you get the Meta... the Meta Quest. All the way down to $350, $300. Plus, if you can't plug this into a computer... Android apps are great, but I'm sure you'll be able to mirror your Chrome tab or something. Because right now, the Quest can already mirror your computer display. I feel like you need to have that feature in this headset.
But also, if you're going to charge more than the Quest, then you have to be able to do more than the Quest. And the Quest has a huge library of games. That's like what they're really good at right now. And obviously the work aspect is okay, you can computer mirror. I would imagine that this kind of gets positioned as, yes, you can work in it, but also you're, you're just going to get so much done with Gemini all the time. It's going to be the super useful thing
that helps you all day. You're going to want to walk around the office and write emails while your hands are in your pockets. Yeah, but the problem is this thing is so big, just like the Apple Vision Pro. Like, how long until this is in a pair of Ray-Bans? Like, if they're just going to put Gemini in glasses. Yes. Okay. Right. Great point. And I think this will probably be the last big point for this. Okay. Android XR as a platform
can be in a bunch of different form factors. So in this video, I said that Project Moohan, think of it as like the Pixel or the Nexus or whatever of this platform, meaning that it's an example of a version of the hardware you can put it on.
But there will be, from other manufacturers, higher-end versions, lower-end versions, different form factors. I think we were maybe having a conversation about PCs, maybe it was last week on the podcast, but like, desktop computers are still a big thing that you put on your desk, and people will buy higher-end components or lower-end components, but they'll just run Windows at the end of the day.
And so Android XR will run on smart glasses too. But if it's on smart glasses, you will have less total functionality. You can't watch a movie on them. You can't see the Gemini UI as easily. It'll just be text based instead of like this whole.
animation thing. So there are pros and cons to different form factors. But the idea is they're introducing Android XR, and then hopefully the hardware ecosystem blooms and the developer ecosystem blooms and everything's blossoming at the same time. So instead of trying to crash the VR and the smart glasses into each other,
You're just saying, we're just going to run on all the types of platforms. Yeah, all the types of headsets. Exactly. Yeah. Android XR is the base software layer for all of the future AR-VR-XR experiences that will come in the form of headsets, glasses, et cetera, from Google. Interesting.
I'm very interested in where this goes and how they position it and all this stuff. I did hear that they were planning to release it in 2025, though. They wouldn't stop telling us that. They couldn't stop. Every time we asked about it, we're like, so, this year? And then Andrew would ask, so, actually this year? And they'd go, yeah, yeah, yeah, yeah, a couple months. I think you should have asked them where the Galaxy Home is.
And I'd feel more comfortable. I know, yeah, every company's got an AirPower that makes us go... Literally every company has announced something where we go, can we trust them anymore? Are they gonna make... even if they have a track record, you're like, yeah, but you didn't ship that one thing, so I don't know. That's true. So interesting. Yeah, Samsung, we'll see. 2025, allegedly. Okay, and nothing about price. Nothing, nothing about price. All of the... all of my price estimates would come from, like, if I was betting,
We could probably estimate in an educated way. Like I looked at that display in the field of view and the materials and the battery. And I don't know the bill of materials, but I imagine it from Samsung, they would position this in a way that they could undercut Vision Pro. Yeah, but still be looked at as premium. Let's take a market that has very few people in it and make it even smaller. Well, Quest is popular.
That's true, but for gaming. You know what that's for. Nobody knows what these other things are for. Well, hopefully... so I did play one game briefly. I don't remember, it was one of those, I'm going to launch this like set media. I forgot the name of this game. It was like a flat game with a house cross section with a bunch of flappy bird.
Oh, was it Fallout Shelter? Yes, that. And I played that for 30 seconds. And I was like, wow, it works with my hand controls. That's crazy, or I could have a mouse. Did it feel intuitive at all? I don't know how to play that game, so I couldn't answer that. But yes, I mean, I zoomed in and I clicked things and I held things down and I moved them around. So yeah, I think it was pretty intuitive. Decided to play mobile games on a thing I have to wear on my face. I mean, there's going to be games that don't work at all.
Like, I don't... I imagine like a Need for Speed game is not going to... it's probably just going to be in a window, right? You got to have controllers at that point. Yeah. I think you need controllers. Right. So is this built on Daydream or is this just a brand new situation? We didn't talk about that. I don't think Daydream. I don't think Google even remembers. Do you remember Cardboard? Yeah. It was under my seat. Cardboard was fire. Cardboard was a great idea. Just put your phone right on your face. I'm going to guess when this Samsung headset comes out, it's going to retail for
Nope, more. No, I was gonna say $1,599 or $1,699. I'm guessing like $699. Although... $699. I think it'll be more than Quest, but it'll be way cheaper than Vision. Wait, $699? $799. $799? Yeah. Not $1,799. I'm going with $1,599. Ooh, okay. Like, think about it, an S24 Ultra is already $1,299.
Yeah. But everyone knows what to do with an S24. That's my thing. In order to get people to buy a thing, people buy the Quest, even though they're not totally sure what they're going to do with it, because it's 500 bucks and it's like two games they know they want to play. And during Christmas, it's 350. They want to play Beat Saber. It's a splurge moment. The Vision Pro is almost never a splurge moment.
Yeah. I think it's all about positioning from Samsung. Totally. In the way that they position their phones against other phones, they have to... Did they even position their phones? Well, they call the... We know you're gonna buy it. They have the S25 and the Plus and the Ultra, which has to be more because it's bigger, but it's not that much better, but it has to be more expensive. So it's $1,299. Yeah.
That's because they know that there's that market of people that will just get the best thing no matter what. I think they'll see other headsets come out after this and then this one is the premium one in that world. There will be $700 headsets that run Android XR. But I think they're going to look a little different from this one. I think the one that looks like Vision Pro, I think they're aiming it at.
people who might have thought about buying Vision Pro. Somebody who has $3,000. You got to make your comeback. We're going to get those cheaper headsets, but I think they're going to go right up well past $1,000. So that's why I'm going $1,599, $1,600. For some reason, $1,647 in my head, but that's not a real price. $1,640. Great year. Let's go, $1,599 is my official guess. I guess I haven't touched it, so I don't know how the hardware quality is. There's glass on the outside. The frame is metal. It's not plastic. Remember, Quest is just plastic. Yeah.
And I think this has metal. It had fans. It had that Snapdragon chip inside. Man, could you hear the fans? No, I didn't hear them. OK, but it had fans. It had, like, nice machined buttons. It had the braided cable for the battery. Like it was definitely supposed to be pro. Yeah.
So, yeah. It did look like a Vision Pro. You had the top shot of it on your head, from above, and it has the same freaking button. I said in the video, and people got mad because I compared it to Vision Pro so much, but I was like, it's not a digital crown here, it's a button, and they were like,
Why do you keep using Vision Pro as a reference? Because Samsung did. That's why it looks like a Vision Pro. And it's also the one that, like, a lot of people have seen. That video's at 30 million views or something, like, people have seen Vision Pro. Yeah. So this is, this is the reference. This is the
Android version of the thing you've already seen. This is, we've got Vision Pro at home. There you go. For people that want to buy it for their kids. Do you want to make an official price guess? $999. $1,799? $999. Oh, I feel like it's gonna be more now. It is. It's really not. Okay, I'm telling you, I'm giving you a chance. $1,299. Okay, final answer. Okay. Lock it in. Now we should do trivia. Okay. Okay. $1,299. Trivia, dude. So.
David Coz and Damien Henry were two French Google engineers. With their 20% innovation time off, they made something that was introduced at the Google I/O 2014 developer conference. In fact, everyone that attended walked away with it. What was it? I have an idea.
I was there. I think I went to that. I think you did too. What was the thing? 11 years ago. I should know this, when we were not old. When we were wee lads. When we were wee lads. All right. Well, we'll think about this one. Answers will be at the end. Like usual, we'll be right back.
Support for Waveform comes from Life360. So we'll be the first to tell you that not everything goes to plan. Life comes with its own set of curveballs for you to handle, especially when it comes to protecting your family. But while you can't foolproof your life, with Life360 you can family-proof your family. So Life360 is a location sharing app that makes family life easier. Knowing where everyone is at any given time makes coordinating daily routines and activities a breeze.
And you can attach a Life360 Tile tracker to all of your family's important stuff and then track that stuff down right within the app. So whether it's your keys, your wallet, or even the family pet, you can put one on a collar.
I have many things that tend to go missing in my life. Generally, my cameras tend to go missing. Ubers famously tend to go missing with my cameras. There was this one time that I definitely left my camera in an Uber and it was driving around, but I knew that it was in the Uber because I had a tracker on my camera. So now all my most valuable cameras that I carry with me a lot, they have trackers on them. Perfect. Yeah.
So this year, you can stay connected with location sharing and stay coordinated with place alert notifications for when someone arrives or leaves a given location. So you can family proof your family with Life360. Visit life360.com or download the app today and use code WAVE to get 15% off. That's life360.com code WAVE.
Support for Waveform comes from Hostinger. So making your presence felt on the internet can be a challenge. Sure, you can try to make all the viral TikTok videos you want, but if you wanna have a place where people can interact with you and be able to check out your product directly, you need a home base. And not just one of those cookie cutter ones that we've all used, this is your brand. It's your aesthetic and your footprint on the internet. So that's why you might wanna check out what you can make with Hostinger. So whether you're looking to create a website or an online store, you can do it using Hostinger's AI website builder.
Just describe your site, name it, and get a personalized site in seconds. And we're not just talking about the design, you'll also have the ability to generate content that matches your style and tone and tailor it to the length and topic that you choose. And you'll also be able to launch your business with a free domain and professional email, included with your plan for the price of a cup of coffee.
Plus, Hostinger provides intuitive tools that anyone can use and 24/7 support if you get stuck. So take your big idea online today with Hostinger. Head to Hostinger.com slash waveform and start your website today. Use code waveform for an extra 10% off.
This week on Prof G Markets, we speak with Robert Armstrong, US financial commentator for the Financial Times. We discuss Trump's comments on interest rates and who might emerge as the biggest winners from the DeepSeek trade. In the world we lived in last Friday, having a great AI model behind your applications either involved building your own or going to ask OpenAI, can I run my application on top of your brilliantly good AI model?
Now maybe this is great for Google, right? Maybe this is great for Microsoft, who were shoveling money on the assumption that they had to build it themselves at great expense. You can find that conversation and many others exclusively on the Prof G Markets podcast. All right. Welcome back. We've got to talk about this DeepSeek thing. Yeah. Now here's my relationship with the DeepSeek headlines. Okay. I was working on the Project Moohan video. Okay.
and trying to distill all of my thoughts on this headset that I just tried, like over a flight back, and like writing it and carving out all the footage and color correcting and editing and animating and all that blah, blah, blah. And then when I uploaded it, DeepSeek had already gone through an entire news cycle of it coming out, being the talk of the town, and then suddenly maybe not being so hot. It's still in the news cycle. It's definitely still in the news cycle. A lot. I need you to explain what I missed. Okay, yes.
Okay, so it turns out we all missed this, actually, because the original big DeepSeek news, well, part of the big DeepSeek news, happened on Christmas Day. So clearly a lot of us were going to miss this. Oh, wow. Okay. That's my fault for not really digging into the forums on Christmas Day for the groundbreaking AI reports. Yeah. Actually, the only things I've seen are, number one,
It was an AI model that costs a tenth of what OpenAI's models cost, and it was just as good. Allegedly. Allegedly. And then I heard that it was just trained on OpenAI stuff, and not actually that impressive. Oh, no, they stole our stolen content. So that's my entire familiarity. All right. Yeah, high-level overview. This is an AI language model similar to OpenAI's language model.
Yes, it is specifically a language model. They also introduced an image model, but the language model is the thing that people are mostly talking about. So on Christmas Day, this company called DeepSeek that is this Chinese AI company released this model called DeepSeek V3. V3 is a foundation model. Came out on Christmas. No one really talked about it that much because it was Christmas.
The thing that is causing a lot of stir in the community is that it performs really well. It performs very similarly to things like GPT-4 and like one of Claude's best models, stuff like that. But probably the biggest thing that caused the stock market to start tumbling and all of these things to happen is that reportedly,
it only cost $5.6 million to train. I think the headline I'm remembering now is that NVIDIA lost like $600 billion of market cap. They lost half a trillion in market cap in one day, which is a whole Stargate. Because of these headlines by DeepSeek. Yes. Wow. So if you extrapolate the fact that this cost $5.6 million,
Most of these AI companies basically are trying to create these gates... not gates, these walls, on who can compete with them, by saying, oh, it costs $100 million to train a foundation model. So only these giant companies like Google and OpenAI
and I guess Claude, and Meta, can train their own models to be competitive. So it's semi-open source. It's like open weights. So they released this really big paper about how they trained it. And the paper basically discloses all of these different methodologies that they use, that basically take similar methodologies to what the big companies use, but take the most efficient approach possible, right? I think a lot of the companies like OpenAI and Meta and Google,
their big thing has been, oh, we're just going to throw more compute at it, because for a while these AI models scaled with the amount of compute that they threw at it. Eventually, they started slowing down and now they're trying to figure out different ways to make them more effective. For example, OpenAI's o1 model is this reasoning model that takes multi-step reasoning to think through problems, which has a higher level of accuracy for actually solving complex problems.
But things like that o1 model are in OpenAI's $200-a-month tier. So they have all these insane gates, right, that stop people from actually using them or competing.
I think I saw them saying something about how the usage of that tier was so high that they were still losing money per user. Yeah, which is crazy, how expensive this stuff is supposed to be. So some of the efficiency things that DeepSeek did to make this a lot cheaper is they used techniques like this one called mixture of experts. And what that is, is basically, if you have a giant model,
For example, this V3 model has 671 billion parameters. You can basically say, okay, well, I'm going to take 37 billion of these parameters and specialize them towards things like code. Because when you feed a model a ton of data, you've got a ton of coding data, a ton of just natural language data, a ton of data about different topics.
So it's not actually efficient to do this next-word prediction thing if you have to run through the entire model to find the next token. It's more efficient to say, oh, I've got an expert on coding in the model here, so let's only run 37 billion parameters instead of 671 billion. And that's both more efficient and more effective. So it can be cheaper, it can be faster, and it can be just as, maybe even more, accurate. Right. And it's also, yeah, it could be more accurate.
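To make the mixture-of-experts routing idea a little more concrete, here's a minimal illustrative sketch with toy sizes and a plain top-k router; it is not DeepSeek's actual architecture, just the general pattern of a router picking a few experts per token so most of the parameters stay idle:

```python
import numpy as np

rng = np.random.default_rng(0)

D = 64            # toy token-embedding size
NUM_EXPERTS = 8   # total experts in the layer
TOP_K = 2         # experts that actually run for each token

# Each "expert" here is just a small feed-forward weight matrix.
experts = [rng.standard_normal((D, D)) * 0.02 for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((D, NUM_EXPERTS)) * 0.02  # scores experts per token

def moe_layer(token: np.ndarray) -> np.ndarray:
    scores = token @ router                    # one score per expert
    top = np.argsort(scores)[-TOP_K:]          # indices of the top-k experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                   # softmax over the chosen experts only
    # Only the chosen experts do any work; the others are never touched.
    return sum(w * (token @ experts[i]) for i, w in zip(top, weights))

token = rng.standard_normal(D)
out = moe_layer(token)
print(out.shape)  # (64,) -> same output size, but only 2 of 8 experts computed
```

In DeepSeek's case, the numbers quoted above mean only around 37 billion of the 671 billion parameters are active for any one token, which is where the cheaper, faster inference comes from.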
And it can also be much cheaper to inference. So if you're actually the end user using it, it costs less per token, because it doesn't have to go through the entire model to find what you're looking for. So this V3 model was already a big deal because of how cheap it was. But the thing that's causing the bigger stir is, just last week, they announced a new model called R1, not
of Rabbit R1 fame. Can't prove it, cannot prove it, but R1 is a reasoning model similar to OpenAI's o1 model. It does the same multi-step reasoning that the o1 model does, but it does it out in the open, which is very interesting. The difference is when you ask o1 a question, it'll say thinking, thinking, thinking, thinking, and then just gives you the answer.
That's what I do usually. Whereas, the interesting thing about this R1 model is when you ask it a question, say you give it some complex question, like, oh, there are four blocks on top of each other. One is green, it's in the middle, one is red, and it's here. If I were to move the green block on top, can you tell me the order of the blocks? This is an example from the Computerphile YouTube channel that I just ripped off.
Now, normally it would get those things wrong because it's like very difficult to take all that information and understand the physics and all that stuff. But in a reasoning model, it's similar to a human being who writes down the problem and you have to work it out, right? Because you can't just instantaneously come up with that answer.
And so it will step by step tell you, okay, so theoretically, if you were to move that block to the top, then now the blue block is below the red block, which means that blah, blah, blah, blah, blah, blah. And OpenAI used to hide all of that compute.
But R1 shows it step by step, which is very interesting for a lot of people because now it shows you how these models are actually working, which is really crazy. I kind of like that. Yes. Because it's not like sourcing it necessarily, but it's kind of interesting to see it and maybe catch a mistake in the process as it happens. Right. That's cool. Right. The other thing that is interesting about the way that they trained this
is that generally the way that you train a lot of these AI models with complex problems is you would tell it the question, you would tell it all of the steps in between working out the answer, and then you would tell it the answer. And that would be the data that you feed into the model. So you're just feeding all of these worked-out questions into a model, which is a lot more data. And it's also not really learning, in the broader sense of the word,
how to actually work out the reasoning for these questions. It's just kind of mapping, here's the question, here's the reasoning, here's the answer. And so it's less likely to get those things correct, takes a lot more data, et cetera. This time, they're basically just giving it the question and they're giving it the answer.
And they're basically saying, you get a cookie if you get it right, and you don't get a cookie if you get it wrong. And so now they're sort of training the model, instead of just mapping the question to the worked-out steps to the answer, they're actually kind of training it to do that reasoning itself. And that is way cheaper to do as well.
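Here's a deliberately tiny illustration of that reward-only-the-outcome idea. The "model" below is just a preference between two canned strategies, so it only shows the shape of the training signal (the final answer scored right or wrong, nothing else), not DeepSeek's actual RL recipe:

```python
import random

# Toy illustration of outcome-based reward ("a cookie only if the final
# answer is right"), as opposed to imitating fully worked-out solutions.

def careful(question):   # strategy A: actually work the problem out
    return str(eval(question))

def lazy(question):      # strategy B: guess without any reasoning
    return str(random.randint(0, 20))

strategies = {"careful": careful, "lazy": lazy}
preference = {"careful": 1.0, "lazy": 1.0}   # the "model's" only weights

dataset = [("2+2*3", "8"), ("(2+2)*3", "12"), ("7*6-2", "40")]

for step in range(200):
    question, correct = random.choice(dataset)
    # Sample a strategy in proportion to the current preference (the policy).
    name = random.choices(list(preference), weights=list(preference.values()))[0]
    answer = strategies[name](question)
    reward = 1.0 if answer == correct else 0.0   # only the final answer is scored
    preference[name] += reward                   # reinforce whatever earned the cookie

print(preference)  # "careful" ends up strongly preferred over "lazy"
```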
So yeah, the thing that is crazy about all of this is because it is so much cheaper and they did it also on a bunch of NVIDIA H800 GPUs, which are the older, less fast GPUs that they were able to buy before the export ban because currently
under the Biden administration, NVIDIA cannot sell H100s to China because in America, they want to be the fastest, most brawniest, you know, never. So they're saying, oh, we did this on H800 GPUs. We did it on like 10 to 50 times less GPUs. It only took two months to train instead of three months
that it took to train the OpenAI model, and the OpenAI model also had way faster GPUs. So suddenly, an order of magnitude fewer GPUs, 50x fewer GPUs you can use, and NVIDIA's like,
People aren't going to buy as many of these anymore. I see, that's how it relates to NVIDIA. DeepSeek, it sounds like, is a genuinely innovative and useful, much more efficient, much less expensive, both to use and to build, model.
Sounds like this is like the Liado Mega of language models. People are considering this a Sputnik kind of moment. Sure, if by people you mean Marc Andreessen. Well, no, more than just Marc Andreessen. In that it's like a new innovation that's not from the US.
In that we kind of, like, yeah, a lot of US companies have been like, we're the only ones in town. And then out of nowhere, someone releases something in tech that's, like, better in some circumstances. And so it freaks everybody out. So what happens when you take this efficient model and put it on Google servers that are running like a hundred thousand NVIDIA GPUs? Like, would it just be that much better?
So there is this paradox that I cannot remember the name of that people bring up a lot in regards to this, where if you can run something more efficiently and cheaper, more people are going to want to use it. So it scales with the efficiency. So it's like,
Yeah, if you have more servers, you're still gonna use those servers, so, you know, it'll be cheaper to run. I've heard the same thing. But more people are gonna use it. Yeah. Yeah, there's some paradox around it. And that you just like fill the demand with your savings. Yeah. Right. The other thing is that theoretically now you could run this on like a 4090 or something. You don't need like these super computers anymore to be able to run these models. And so if you're like a university,
and you've got a small cluster, but it's not like an OpenAI-size cluster, you could run a local version of this that's tailored to specific tasks locally on your servers, and then you don't have to pay OpenAI anymore, right? So this is the other thing. Because the weights are open and people are trying to, like, re-engineer it,
and it's free to use and the research paper is out there, it's kind of open-sourcing this thing that these few companies have kept closed for a long time, because they want to maintain this monopolistic kind of leadership stance. Wow. Yeah. Okay. So that's the first half of the news cycle. I can see the reason for the hype and, like, all the stock movement and stock prices and all the headlines and everything.
But then there was another half of it where it sort of cooled off or at least started getting broken down and exposed more for what is actually happening. Is that also? Yeah. I mean, people are finding technicalities. So they said that the last run that actually trained the model is the thing that cost $5.6 billion.
Billion dollars? Million. Yeah, million. $5.6 million, instead of from zero. Yeah, instead of from scratch. Like all the R&D hours, all the other compute, all the data collection, obviously that costs a lot more money. So it's kind of like the same thing as your bill of materials thing, right? Yeah. It's like, what did this actually cost full on? They haven't released that. So it's a little bit different. Has anyone asked Carl Pei?
The other thing that I think is very funny is they're basically saying that there's this other thing that you can do when you're training models called distillation, where you basically take your model, and you take outputs from a good model, and you use those as training data for a new model.
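As a stripped-down picture of what distillation means here, the sketch below just shows the data flow: harvest a strong "teacher" model's outputs and use them as the training set for a smaller "student". The teacher and student are trivial stand-ins for illustration, not real models:

```python
# Toy picture of distillation: the "student" never sees human-written answers,
# only the "teacher's" outputs. In the real case the teacher would be a large
# model behind an API and the student an actual smaller LLM.

def teacher(question: str) -> str:
    # Stand-in for an expensive, high-quality model.
    canned = {"What is 6 x 7?": "42", "What is the capital of France?": "Paris"}
    return canned.get(question, "I don't know.")

questions = ["What is 6 x 7?", "What is the capital of France?"]

# Step 1: harvest the teacher's outputs as synthetic training data.
distilled_dataset = [(q, teacher(q)) for q in questions]

# Step 2: "train" the student on (question -> teacher answer) pairs.
student = {}
for question, teacher_answer in distilled_dataset:
    student[question] = teacher_answer   # stand-in for a gradient step

print(student["What is 6 x 7?"])  # the student now imitates the teacher: 42
```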
And a lot of people are thinking right now that they used outputs from, like, GPT-4o as data to train this model, because you can create a much smaller model with better data. And if the good data is in the outputs from another AI model, then you can do that. And so a lot of people are freaking out. But also a lot of data. Exactly. So a lot of people are freaking out because they're like, oh my God, they might have stolen OpenAI's outputs, but it's like,
Boohoo. Yeah. I guess that makes sense, because at that first thing you were saying, I was like, okay, so we're just making another GPT-4. It's like, what's the point? But it is still cheaper and more efficient to run and all those benefits. So even if it's giving you the same outputs as GPT-4, it could have advantages. So I guess I get it. Way cheaper to run, way cheaper to train. You could train, like,
a local version, you could use your own local version. Chain-of-thought reasoning is really cool. Just a lot of things that I think a lot of people were not expecting. And then it also made it seem like, okay, maybe we don't actually need these supercomputer clusters the size of Manhattan to be able to run these things.
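For the run-it-locally point, here's a minimal sketch of what that can look like with an off-the-shelf distilled checkpoint. The model name is taken from DeepSeek's public Hugging Face releases, but treat it, and the assumption that a 7B-class model in half precision fits on a single 24 GB consumer GPU, as things to verify against the model card rather than gospel:

```python
# Minimal sketch of running a distilled reasoning model locally with
# Hugging Face transformers. Swap in whatever checkpoint you actually have.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="deepseek-ai/DeepSeek-R1-Distill-Qwen-7B",
    device_map="auto",   # place the weights on whatever GPU is available
    torch_dtype="auto",  # use the checkpoint's native precision
)

prompt = "If I stack a red block on a green block and then swap them, what is on top?"
result = generator(prompt, max_new_tokens=512)

# Reasoning models like this typically print their chain of thought
# (often wrapped in <think> tags) before the final answer.
print(result[0]["generated_text"])
```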
Also, the entire nuclear industry kind of crashed on this too, because people were investing in a lot of nuclear companies because they were thinking, we're going to scale up AI clusters so much, we're going to need nuclear power to run this. And now they're like, actually, you need 50 times less GPUs than you thought. That's really funny. And so now the nuclear industry is like,
That's really funny. I mean, I am not the expert on this, but in a far enough future that I'm envisioning, envisioning of people using AI all the time, you have to be able to do it more efficiently. Efficiency has to be one of the focuses. We can't just keep scaling up to just bajillions of GPUs everywhere. I mean, that's what we've been doing. That's what NVIDIA wants. That's what we've been doing. And if you follow the graph of NVIDIA stock price, you can see how we've done that.
Exactly. But it seems like, yeah, efficiency also has to eventually be a focus in some of these. I'm glad that it's at least started to highlight that. I think in most industries and markets, when more people have access to innovating on a thing, you get so much innovation that the efficiency and the value proposition just go up exponentially. But this market is so closed. You only have OpenAI, Google, not even really Apple, Claude, and Meta.
That's sort of it. There's a bunch of small AI companies that have spun up, but they're all basically using OpenAI as their foundation. So yeah, I think this just kind of shows that open source is going to cause a lot of commotion. Fun. Yeah. Pretty big deal. I'm here for it. Pretty big deal. Also, apparently, there's now a DeepSeek app, and it is the number one app on the App Store. Sick. That was fast.
It will apparently not talk about Tiananmen Square, because it's still a Chinese app. Yeah, so is it going to get banned? There are so many things that are unanswered about this. I would not doubt it if that got banned. If you run it locally, it will tell you whatever you want. If you run it through the app, which is going through
the Chinese filtering systems, it will not tell you whatever you want. I just went to sign up, and it says, due to large-scale malicious attacks on our servers, registration may be busy. Please wait and try again. Yeah, they're basically saying they got DDoSed. Yeah, everybody probably has a kind of magnifying glass on them right now. So that doesn't really shock me. But yeah, okay. Yeah, things are happening.
So that was a big deal. NVIDIA's stock price has since kind of gone a little bit back up. There have been many think pieces about, well, it cost $5.6 million to train, but what about all the other money? And what about this? And oh, if they're using distillation of OpenAI data, then the actual true cost is blah, blah, blah. And it's like, okay, I feel like that's a lot of hand-waving, personally. I don't know. I think it just shows that if you make something open, way more innovation can happen.
and also we're not the only ones in the AI race. Pebble. Do you know this name? Pebble. The word Pebble. And it's not the social media network. Pebble brings me back, because Pebble is the first smartwatch I ever had, technically. It was before the Moto 360. I had a red Pebble. I definitely did a video on the thing too, a red Pebble with a black band.
And all it was was an e-ink display on my wrist with notifications. Yeah. I didn't really want it to do too much else. It was just so I could have less screen time. I could just check my wrist and be like, oh, I can see what time it is. And then I just got a text and I'm not even going to read it. And there it is. Yeah. That's my Pebble experience. Yeah. So Pebble was one of the most successful Kickstarters of all time. At the time, I think it was the most. And then they eventually went out of business, right?
Yeah, okay, so they raised a ton of money, like $10 million at their first launch, when they expected to raise a hundred thousand dollars. They became really popular with nerds, they released a bunch of new Pebbles throughout the years, they lasted about four years, and then eventually Fitbit bought Pebble.
Fitbit bought Pebble. Fitbit bought Pebble. And they shut down, and they shut down the development of the Pebble 2 and the Pebble Time 2, and then they refunded everybody. It was a whole thing. And then, if you remember, Google bought Fitbit. Oh, right. So there's just been a bunch of, like, distillation of the IP of Pebble. And not a lot of people have used e-paper displays since Pebble. So that's something we didn't mention. It is a smartwatch, but it is an e-paper display smartwatch. Part of the
kind of sell of Pebble is that it lasted seven days. It did very basic stuff. It was very nerdy. It was a very open ecosystem. You could develop for it. It was this awesome community. You could see it in sunlight. You could see it in sunlight. Tiny three-pin charger. It was pretty solid. I remember that now. The battery lasted so long.
Yeah. Now, something you might not know is the founder of Pebble is Eric Migicovsky, who is the same guy that made Beeper, which we talked about a number of months ago. Beeper was that app that basically hooked into iMessage and kind of temporarily disrupted Apple until they shut it down. And I was completely wrong about whether or not that would last.
But here we are. So Eric has apparently been rallying Google for a while now to open source Pebble, because they have not been doing anything with it. Yeah, seems like Google forgot. I bet Google bought Fitbit and forgot that they had Pebble stuff in there. Yeah, I think they also forgot they bought Fitbit, but yeah.
So yeah, he basically was able to rally Google to open source all of the Pebble stuff. And because of this, he's bringing Pebble back. And e-ink slash e-paper has gotten a lot better since 2012, 2014. So bringing Pebble back could be kind of neat. Like, I don't know, I'm still picturing the same general premise, which is a smartwatch-type device on my wrist with an e-ink display,
a week-plus battery life, and a decently fast refresh so that I can get notifications, I can swipe them away, maybe I can control media on my wrist or something like that, basic stuff like that. I think that would still be pretty cool. I think that smartwatches generally are very overpowered for what a lot of people use them for anyway, and just like with a lot of smartphones, people would trade functionality for battery life a lot of the time.
So Eric is basically starting a new company that is going to use the Pebble OS, but he cannot call the company Pebble because Google still owns the trademark. So he basically put out this website with a signup form where you can sign up to get updates on when new Pebble stuff is coming out.
They won't call it Pebble, but they'll probably have, I'm predicting now, some really punny riff on the Pebble name that reminds everyone that Pebble existed. Yeah, I believe the website is called RePebble. There you go. There you go.
Okay, I got on a call with Eric a couple of days ago to talk about this, just to kind of do some Q&A. You will be able to build on top of the new hardware and software, because Pebble OS is being open sourced, and everything that the new company builds on top of the software they're putting back into the open source project. Yeah, which is cool, which means you can basically build your own hardware and flash it to it. That is pretty cool. Yeah, so this is kind of like an open source device.
They are going to build new hardware, so I would love to see what that looks like in 2025, because, like you said, the technology's gotten much better. I want a Pebble in the shape of a Moto 360. Yeah, without the flat tire. Without the flat tire. Well, they had the Pebble Round. They had the Pebble Round, and it was basically in the shape of a Moto 360. Bigger, please. Bigger screen. Bigger screen.
Okay. Other cool thing: because it is open source, it can run on basically anything that has a microcontroller. So if you want to flash Pebble OS to your fridge, you could probably do that. Someone will probably do that. Someone's going to run Doom on their fridge on Pebble OS, for sure.
They are planning on sticking to the core feature set of the original Pebble. Eric says that not everyone wants all of the features that you're getting from the Apple Watch and all the stuff like that. So that's exciting. He put out this blog that was about the things that he learned from having Pebble fail in the first place. And someone is working on a port right now so that you can develop on top of it. So it's pretty exciting. The more you talk about this, the more I'm picturing a pretty simple, clean smartwatch.
Like a simple watch face, step tracker, sure, maybe heart rate. I don't want too much more than that. Keep it easy, keep it simple, keep it at, like, a week-plus battery life. I think a lot of people would be into that. And it would probably be cheaper, because you don't have this huge, expensive OLED display that has to get super bright. Okay, I'm in. Yeah. Eric had told me he was going to CES to talk to vendors, and I was like, what for? The vendors? For what? Now I know why. Yeah, exactly.
I'm very curious to see what it ends up looking like for people to build on this thing, because the GitHub right now is all written in C, which is, to me, very difficult to understand. And so I wonder if there's going to be an easier way to write for this platform. Yeah. Remember the Playdate? Yeah. The platform to develop for that thing was custom software that they released. Right. It's a super easy game builder. Yeah. So I wonder if there's going to be something like that.
That'd be nice. Right now, he only has four part-time employees working on this. That's all you need. That's all Google has on it, too. They forgot that those people were still working for them. I'm very interested to see if the community that loved Pebble in the first place builds this back up.
We get some normies on this thing. I think a lot of people are starting to feel a little bit overwhelmed with how much their technology is tracking them and the overkill of everything right now. We've innovated past the amount that we need. I think the reason that Pebble initially took off and had one of the most successful Kickstarters ever is going to be even more true now.
I think people will still like this. What was the reason, in your head? Well, it was a way to use your phone a little less. You would get the notification on your wrist so you didn't have to take your phone out of your pocket. And then the notification would say, like, it's a text from someone, or it's an email, or whatever it is. And then you can just put it down.
And then it would just go back to showing the time. And if it's simple, beautiful, shows you the time, and you're not poking around and doing all the gestures and checking your heart rate and all that stuff, you just want a simple smartwatch. It doesn't have to be a $700 smartwatch. It could be a $149 smartwatch. And it could do all the things you want. The animations are also very nice.
Yeah. Um, also, this came out when the iPhone 4S was the newest iPhone. So. Back. Yeah. Throwback, that nostalgia too. Yeah. I think so. It's the square one before they got round and then eventually got square again. I think the 5S was the first one I ever reviewed. Yeah. Wow. 5S. Wow. To me, that's nostalgia. Yeah. Pre. Yeah.
Wow. Anyway, that's about all for that. We'll keep an eye on it. Okay. Cool. All right. Well, we should take another quick break. We got some Something or Nothing coming up. But before that, trivia.
Trivia. Not the pronunciation. Yeah, I know. Yeah, it's lacking. So at CES 2018, Lenovo launched a standalone headset running on Google's Daydream platform. Do you remember what it was called? But the Lenovo headset was called? No. Q87986X. Close. You have to put a ThinkPad in front of that. The ThinkPad Face. The ThinkPad Elite Facebook. Facebook.
Damn, I have no idea. I'll be totally guessing at that one. Yeah, answers at the end though. We'll be right back.
Welcome back. So now we are going to do a game that we call Something or Nothing. Let's play a game. Let's play a game. So the rules are simple. I'm going to read a headline, and you guys have to tell me: is this something that we should care about, or is this nothing at all? And I got you. Yeah. Nothing at all. Nothing at all. First up, iPhone SE 4 leak shows single camera and no Dynamic Island.
Is this something or is it nothing? Nothing. I'm going with nothing. The one thing that's interesting about it is that usually the SE is in the body of an old iPhone that already exists. Yeah, which is why they can make it so cheap, because it's parts they make already. I agree. This one is a
slightly, air quotes, new design, which is like, oh, they're making a new part for this thing. Other than that, single camera does not surprise me, no Dynamic Island does not surprise me, so I'm going nothing.
Yeah, they're keeping the Dynamic Island for the nicer phones. The SE is the phone I buy my mom every seven years, so I don't think she's going to worry about the Dynamic Island. You guys don't think it looks beautiful? The Dynamic Island? No. This is like a 5 more. I like how it looks. Yeah, there's a week to five almost. Whoops. I feel like this is going to be Ellis's next phone. Oh, 100%. Is that because it's small? Because it's small. It's not even that small. Yeah, I was going to say, you're right. Isn't it like a six-inch screen? It kind of looks like an XR.
You mean Ten R? What? Ten R? Ten R? The Ten R, that's what they call it in the UK. Yeah, I'm sticking with nothing on this one. All right. Yeah, nothing. Nothing. So we shouldn't care about it. Rumored to be an OLED. That could be cool, but still probably. Still not crazy. Still nothing. All right. Next up, OpenAI launches a GPT government edition. Less than nothing. Is this just going to be a repackaged GPT that's more expensive because it's government?
Uh, yeah, cool. I think it's also, like, specifically tuned to government agencies. Um, I think it's just a way for OpenAI to milk the government. Yeah. So the top comment on the Verge article is, why not DeepSeek? It's open source.
And then the top reply was, yeah, it's gonna be banned. Yeah, it'll probably be banned within a week. What's the over-under that DeepSeek is banned in a week? No way. It doesn't have the clout yet for, like, that guy in the Oval Office to do anything about it. It took like a trillion dollars off the stock market. That's actually very fair. Yeah, totally fair. I do think that, yeah, I think they need Congress to be able to actually ban it, but I feel like they're gonna ban it. Wait, so which one's banned first?
Which will be banned first, Lemon8? 100% Deep... no, DeepSeek. Nobody remembers Lemon8. I got 20,000 people over there on Lemon8. That's actually true. I got 1,700 on Pixelfed. You know what? I have the 18th most popular photo ever on Pixelfed. Wow. Is it a good photo? What does it have? I would say so.
It's of Glacier National Park. Okay. You couldn't even break the top 10. So disappointed. Wow. Ever, Adam? I made my account two weeks ago. No, that's a flex. That's a flex. Yeah, let me flex. I respect that. Anyway, GPT government edition, I'm going nothing. Nothing. Nothing. Nothing. All right. Yeah. Next up, Threads is officially getting ads. Something. Everything. Everything. This is everything. I called it. Didn't I tell you guys?
Obviously, Threads had their moment of exploding onto the scene, 100 million users in five days, whatever it was. The headlines were all like, this is going to overtake Twitter, this is the thing, we finally have a competitor. And it's Meta, so Meta's sneakily in the background like, yes.
Yes, come to Threads, the ad-less business. But eventually, yeah, it's Meta. We knew that they were eventually going to do ads, so we knew they were building it in the background, waiting to find a good time to turn it on, and they're going to turn it on. And it's going to be another Meta property with ads, and what else is new? Yeah. But I'm saying it's everything. It is everything, because this is what they do. Yeah. And now they've acquired a lot of users. And then people will migrate to Bluesky, right? That's the next move. Come on, baby.
Because Bluesky now, speaking of Bluesky, 30 million yesterday. Just saying, they passed 30 million users. Will Bluesky get ads? I have no idea. 100%, because they are run by a company. Obviously they're run on that open protocol, but they're still run by a company. Correct. Servers cost money. Servers cost money, and employees cost money. So it's just something Bluesky has to do. You know, Bluesky doesn't even use AWS. They run their own servers.
Expensive. It's actually cheaper than AWS at their current volume, because AWS is expensive; it scales up with volume. So if your 30 million users are active on this thing, it costs a lot, and they charge you for convenience. You still have to buy those servers, and, yeah, either way this will cost money, but it is technically cheaper. But yeah, I have been talking to the Bluesky crew a lot recently, and they said they are thinking about
the right time to move the AT Protocol into the Internet Engineering Task Force, which, if you watched our, hey, if you watched our special episode, we all just glazed over. Come on, the secret history of the internet. The IETF, they're the same guys that do email. Shout out to them.
They have one of the nerdiest-sounding names of all time, but I appreciate that. Daddy Vint. Yeah, daddy internet. I was emailing with Vint this weekend. It was crazy. It was crazy. I was like... Yeah, Vint. You mean Big V, internet zaddy, our father. Anyway, yeah, so Bluesky PBC wants to eventually move the AT Protocol into
a standards body, because they want to instill that trust. Because currently, Bluesky PBC is the same company that runs Bluesky the platform, and the whole idea is that they are not the only one on the protocol. You're saying something? Next one. Everything.
It is something for sure. It is something in the way Meta moves. Do you think this will matter? The ads? When ads hit, there will be another moment, another reckoning, for everyone on Threads who has been preaching the word of Threads, and they'll have to decide if they're cool with it or not.
Yeah, but most people haven't been preaching the word of Threads. They've just been on Instagram and then they're like, what's this? Click the button, and now you have a Threads account. Adam, you're going to say I'm right again. Well, there's that movement. There's also the I'm-leaving-Twitter movement. Yeah. And then when it gets ads, they'll have to be like, I'm cool with that, that's not what I hated about Twitter. And that's exactly my point. Everyone is used to ads already. So yeah, exactly.
Maybe it's nothing. Maybe this was going to happen the whole time and it's just going to be background. It's everything for Meta, because it'll probably boost their revenue by a lot. They have WhatsApp and Instagram with 2 billion users. Yeah, but a cool 100, 200 million more isn't that bad. Yeah, they could use the 20 bucks, I guess. Yeah.
All you need to do, Bluesky, if you want to make all that money back: make very highly customizable ads, highly customizable profiles where you can have, like, a pink background and a song that auto-plays when you go to my profile. Nice, I love it. Next on the list is OpenAI's new Operator AI agent can do things on the web for you. So did you read the articles about this? No, I did not. Okay, I have an understanding of what it sounds like it's doing based on the headline, but is there something deeper?
Well, okay, so Operator is this new feature that they have with the expensive model that they have. Basically the $200-a-month one. Yeah, that's crazy. Basically, it's in research preview mode right now. The idea is that you can say, like, I want to go on a vacation, can you book it for me? And it'll go to, like, TripAdvisor. And then it'll... I still think this would be a banger video. We should do that.
Having AI plan, like, a full trip to Boston or somewhere nearby and following the itinerary. So they have, it's very weird, they have specific partnerships right now where it'll use specific websites and services for all of these things that it wants to do for you, which feels very Rabbit R1 to me, if I do say so myself.
What did they call those agents? Agents, yeah. Oh, same thing. Yeah. It's basically an agent that goes and does things for you, but you can see it like clicking around.
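As a loose mental model of that "agent clicking around" loop, here is a minimal sketch. This is not OpenAI's actual Operator implementation; propose_next_action and execute_in_browser are hypothetical stand-ins for the model call and the browser driver.

```python
# Sketch of a browser-driving agent loop: a model proposes the next action,
# a controller executes it, and the new page state is fed back until the goal is met.
from dataclasses import dataclass

@dataclass
class Action:
    kind: str    # e.g. "click", "type", "done"
    target: str  # element or text involved

def propose_next_action(goal: str, page_state: str) -> Action:
    # Placeholder: a real agent would send the goal plus a screenshot or DOM snapshot to a model.
    return Action(kind="done", target="")

def execute_in_browser(action: Action) -> str:
    # Placeholder: a real agent would drive a browser and return the resulting page state.
    return "<page after action>"

def run_agent(goal: str, max_steps: int = 10) -> None:
    page_state = "<start page>"
    for _ in range(max_steps):
        action = propose_next_action(goal, page_state)
        if action.kind == "done":
            break  # stop here so a human can review before anything is actually booked
        page_state = execute_in_browser(action)

run_agent("Find flight options from New York to Boston for this weekend")
```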
Um, it was funny. I believe Casey Newton did this, and it booked it for, like, the completely wrong date. Nice. Something like that. And everyone that has been using it and talking about it on Reddit says that it's really bad. Like, it barely works. Um, if it works, it's something. But if it doesn't work, or if it works as well as Rabbit's,
which is not at all... I mean, okay, it's OpenAI, so maybe I have to give them the benefit of the doubt, but I don't know. I don't know. Either way, for me, I am heavily, me personally, in the camp of: I want to overthink each individual detail of the things that I'm doing. So instead of asking an AI agent to buy something online for me, I want to double-check it's going to the right address, and there are so many things. So many things. And if I'm booking a trip, that's even more.
I tend to think it's already easy enough to do most of these things online. The capabilities and the trust level have to merge, and both of those things, the capabilities are moving at one mile an hour right now, and the trust levels are going the other direction. I don't know. I think currently it's kind of
Kind of nothing. And I also don't really think that this is the way that we're going to be using AI agents in the future. Like Marques said, I think people just want to make their own decisions. I can just book a flight. Let me look at the flights too, instead of me going, hey, book a flight to Boston for me for this weekend.
I want to look at all the flights, pick a time. The one at noon is actually cheaper than the one at 3PM, so let me do that one instead. Now I'm going to plan something to do when I land at this time because I want to do that. If you can do things like check the available flights and give me options quicker than I could do it myself, then that's cool.
There are probably CEOs and stuff that are looking at the $200-a-month price, and they're like, yeah, book me a flight. Just do it. I don't care which one. And they'll just do it, and they don't think about it, and then they'll get to the airport, and they're like, sir, you're going to see an earthquake, and it's in three months. I need your passport.
I'm going with nothing for now. Nothing for now. Interesting. Okay. Next on the list: iOS 18.3 officially launches, which makes notification summaries italic. If this ain't something, I don't know what is. That's not the official headline, that's what I wrote. Basically, Apple was getting a lot of flak because the notification summaries were saying crazy stuff,
and it was just lying a lot of the time, because it was incorrect about a lot of things, especially because it was trying to group multiple things into one sentence, like multiple different events into one sentence. And topics. And now, when it has a notification summary, it is italic, to signify to the user that it's in beta or whatever. Something. Really, I also think it's something; it shows that Apple's listening. They just made it italic.
True. Do you think, okay, this is my question. I have not talked to like a normie and been like, hey, your notification summaries, is it more obvious that it's an AI guess now because it's italic?
because I actually hate how it looks now. The italic look is not a good look. But it has to just look different from a regular notification. That's the point of the italic, right? So, okay, Apple understands that people have seen these AI summaries and don't like them, and if we make them look visually distinct from normal text,
or a normal notification, then that's better than not. That is something. I feel like it's something, because this is a red flag for Apple Intelligence in general, I think. Summarization is supposed to be the easiest thing that AIs can do. Take this thing and summarize it.
And this isn't even the full Apple Intelligence yet. It's still rolling out. Still need that Siri update. So if they're having problems with this, how do I trust them to be able to actually do a thing in three months? I really want that reaching-into-apps thing. That's supposed to be March. So, yeah, I don't even know if there are problems with this, or if this is just not a useful place to deploy summaries. Like, when I get a notification of one single email,
I don't need you to summarize the email. I'll just look at the subject line. That's already good enough. The subject is a summary. Yeah. When I get three texts in a chat with someone,
I don't know, it's three texts. Maybe a summary is useful, but all I really need to know is I have three texts from this person. Context in my brain is like, oh, I remember what I was talking about with them. That's the summary. I think there's some value to the urgency of the text. You know what I mean? If it says urgent requests needed, then I'm more likely to stop what I'm doing and go deal with the text than I am. If it's just like, this person texted you three times, and I'm like, I'll deal with it later.
Notifications are inherently useful, though. Yeah, we get a lot of them and they're annoying, but they're there for a reason. So I feel like adding this layer of friction in front of the notification, to summarize it in a way that may or may not be accurate, is not really what people want. Like, I want to go into these. Yeah, they are helpful. If I could just go and open the app, or swipe down in Control Center or something, and then choose a summary of notifications, then sure. But to offer it right off the bat, by default, it's just another step.
Apple only really added AI to appease the shareholders anyway. Bar. Bar. This is nothing. This is just more of the nothing.
Yeah, more of Apple Intelligence is nothing. You know what happened? I tried to use Apple Intelligence on my Mac when it launched on Mac, and I used it on an Apple Note that was a whole script of a video I had written, and it summarized it, and then I hit Command-Z because I was like, oh, I'm just testing the summarization feature, and then it crashed.
And then when I reopened it, I could no longer Command-Z it. No. I lost the entire script. Damn. So that was fun. Damn. Don't test in production, my friend. Oh my gosh. That's a bad thing. I should have copied that note, but that's on me. Is it? Is that on you? Yeah, no, it's on you. I'm going to blame it on Apple. That's on them.
AI is bad for intelligence. Next up, David did not win a ticket to the Switch 2 showcase in April, and neither did Adam. Everything. This is everything. This is everything. I agree. Switch 2 is so hype. Switch 2 is so hype. You had to win a ticket to look at it. Yes, that was happening. Correct. You had to apply for a chance. Three chances to look at the thing that you might give them money for later.
No, I will give them money for it. This is crazy. Let's be clear. It is out of control. Dude, it is out of control. Okay, so now that you're not able to look at it, you're still gonna buy it anyway. Yeah, probably. So you just wanted to see it, just to look at it. Yeah. So they basically, they were gonna have three different showcase events where you could play with it early, but they had to ticket it and they had to have a raffle. And so they had multiple sessions for New York, Austin, and Los Angeles. Wait, is this paid?
No, no, okay. It's a free event, but it's just a hype cycle. This isn't intended to make more money. This is them just driving more hype. Yeah. People will show up with a phone and make a 30-second video of it on their phone and put it on YouTube and Shorts and TikTok, and it'll be like, oh my God, I want the Switch, and that'll be it. Yeah. Yeah. That's taking my content strategy. No, that's smart. That's smart. Yeah. Nintendo's smart.
Everything. And the last one: Boom's XB-1 supersonic passenger jet goes Mach 1.1. I put this in here.
Yeah, so I'm assuming you think it's something. I think it's nothing. Oh boy, this is something. The return of the, what was it called? The Concorde. The Concorde. If you have time for a tiny story time, I can tell you the story. We have nothing but time. Okay, so passenger jets. And that's it. Thanks for watching. No, the Concorde was that super, super fast, supersonic plane that did, like, cross-country and transatlantic flights in, like, no time, because it was super fast. It was.
I think it was twice as fast. It was much faster than normal passenger jets, basically twice as fast, and much faster than the speed of sound. The speed of sound is this benchmark, well, obviously, for supersonic travel. It's a benchmark for speed, for planes, perfect. It's like 800 miles an hour, right?
The Concorde crash, you know, sort of retired that plane, and we went back to not-supersonic travel. But the thing that really hit me about this is, and I remember we probably talked about this months ago, but what did you used to do on a plane
before the Concorde, when you had a six-hour cross-country or trans-continental flight. Nothing. You just sat there. There is no internet. There is no TV. There is nothing to do. You just sat there. You could read. And so you were highly incentivized to get the faster flight so you could save more time, get to your destination, and then start doing things again. Now, if you have a six, seven, eight-hour flight,
You're on the internet, you're on your phone, you can do email, you can watch a movie. There's a whole bunch of stuff you can do. So people kind of stopped caring about getting there twice as fast. And they have slowly, gradually ramped up this prototype where, okay, we can go faster and faster, and they got to just under Mach speed. And they just did a test run where they went Mach 1.1, so over the speed of sound. Their eventual goal is something like Mach 1.7. They want to be able to go twice as fast as passenger jets. Basically, you can go anywhere in the world from anywhere in the world,
up to 5,000 miles, in like four hours. Now, it's going to be more expensive. It's the first one. It's going to be pricey. I think United bought like 15 of these things, and they're going to slowly start making them. They've got to make them, though. This is still, like, the testing phase of, okay, we're validating, we can go Mach 1.1, we still have to go faster and make sure it's safe.
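As a quick back-of-the-envelope check on those numbers (treating the speed of sound as roughly 767 mph at sea level; it is lower at cruise altitude, so this is only a rough sanity check):

```python
# Rough sanity check of the Mach 1.7 / "5,000 miles in about four hours" claim.
speed_of_sound_mph = 767      # approximate sea-level value; lower at cruise altitude
cruise_mach = 1.7

cruise_speed_mph = cruise_mach * speed_of_sound_mph   # ~1,300 mph
hours_for_5000_miles = 5000 / cruise_speed_mph        # ~3.8 hours

print(f"Mach {cruise_mach} is about {cruise_speed_mph:.0f} mph")
print(f"5,000 miles takes about {hours_for_5000_miles:.1f} hours")
```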
But we have purchase orders, and when we start making these bajillion dollar jets, people will be able to buy a ticket and go from New York to London in three and a half hours. Sick. Okay, but one thing. We live, we record right next to an airport. Yeah. When you go faster than the speed of sound. Big, big time noises. Yeah. Yes. That's...
Not ideal. One of the downsides. But there are certain slots of airspace that are approved for supersonic travel. So over the ocean, for example, or in the corridor for going from New York to London or certain paths where you used to be able to go supersonic, they did this testing in one of those corridors.
So there are places where it is acceptable to go supersonic. And so I think the idea of being able to get anywhere in no time and probably also still be scrolling your phone on the way, I think is pretty sick. That's why it's something. To get to the West Coast in two hours, that'd be amazing. Think about that. That'd be crazy. Think about that. The flight to California is typically five and a half hours for us. What if it was two? In 20 years.
Well, when they finish making the jets, and they start shipping them, and the tickets come down in price because there are a bunch of jets, that's when it's something. And then, yeah. But I'd get on that first flight, is what I'm saying. No, nobody's on the first flight. I want to be. None of you are allowed to be on the first. But the idea is, it will be something in
10, 20 years, when we are back to these supersonic... like, the jets we fly on now are so old. Yeah, they're not even necessarily old. They're just the same technology as forever ago, and we don't really...
We're more concerned with, I mean, we're cramped back there, the legroom's not that great, the internet's not that great, the TV is not that great. We would like to do some upgrading, so I think it's something. I'm excited. I want to just take trains. Because it would take a while. On a train. I have to channel my inner Ellis here and say bullet trains exist. These are different. Bullet trains exist for certain land-to-land destinations. Yeah, not in the US. But not to London. Not yet. Not with that attitude.
Big tunnel, big tunnel, an undersea tunnel, it's a very big tunnel. And you can put your Tesla on a little track, and then it'll go, it'll, it'll... All I'm saying is, there can be tiers of travel, where there will still be regular flights like this, but there will also be... this will be an expensive ticket. And this will be like, okay, you splurge, you take the bullet train instead of the regular train, and you get there in half the time.
And it's just an available option. I have a proposition for you. They call the rich people, like on the Concorde. They come to you and they say, hey, you can be the first person on this flight and get this exclusive video. No, no, no, no, no. I don't want to be first. I don't want to be first.
So they're doing these test flights. It's like a Navy pilot and they're going in this specific corridor and slowly ramping up in speed. That's all great. Don't put me on those. Then they're going to get to Mach 1.7 and they're going to be like, OK, we can do this at this speed and it's safe.
Now we can... it's almost like car testing, like when you see the top speed runs from the Hennessey Venom F5 or some random car. You have to do the top speed run in both directions, with all the safety equipment and a professional driver. Once you get past that stage, then you can ship them to regular people and regular drivers. And so you can put real passengers on it and not worry too much. By the way,
I'm just tangent city. These Boeings can go faster, just like our cars can go faster. They operate at a certain speed in certain jetways because it's most efficient for fuel. Sometimes I get on a plane and wish this thing would just say screw it and just book it. No, it's going to be able to go even faster, but it's going to go up to Mach 1.7, because that's what's safe.
Mach 4, let's go. Sometimes I get on a flight and we'll be like 30 minutes late from the gate, and the pilot will go, we're gonna try to make up some extra time in the air. And you're like, what do you mean? And then I'll feel like we're going faster, and you're like, yeah, we caught a jet stream with some downwind and we're actually gonna get you there 15 minutes early. And I'm like, you could have done that the whole time. So why don't they do it the whole time? Because going faster in any vehicle takes more fuel, so it's less efficient. So this is a plane that's gonna be capable of going,
like, whatever, Mach 2 or something, and it's gonna go Mach 1.7, and it's gonna be great. Well, I, for one, am excited to be one of the first to get on one of the supersonic jets from Boom Technology. Boom. XB-1. Coming to an airport near you. Hard pass. We'll see. We'll see. Okay. That's all I got. Is that it? Yep. That is it. That means that we should figure out those trivia questions. Figure out, I think you mean,
Where are you going with this? Yell out my answers. I already know I have one of them. I don't know. Question one. David Coz and Damien Henry were two French Google engineers who, with their 20% innovation-time-off project, made something that was introduced at the Google I/O 2014 developer conference. In fact, all attendees walked away with one of these things. What was it?
All attendees? Well, I wasn't there, so I can't confirm that all of them walked out with it. Did they give out freaking Nexus devices at I/O? Yeah, now you get like a water bottle and a T-shirt. Yeah. Disgusting. Although, pretty great water bottle. Two and a half trillion dollar company, and they gave me a water bottle. And the same one as last year, too. Flip and read. What do we got?
I wrote Google Cardboard, but I added a diagram at the bottom. I also added Google Cardboard and added a diagram at the bottom. So we should go by whose diagram is better. Well, first of all, are we right? I was going to say, you are both. Okay, good. I have three dimensions to my diagram. Oh, okay. Oh, Marques wins because it is VR, 3D. You need three dimensions. Hold on.
Here. Although cardboard is 3D anyway, sucker. It was literally just a cardboard box that you put your phone in, but you had to fold it yourself. That was what was sick about it. Yeah, it was pretty. And it wasn't. I think it was a Verizon promo or something, where somebody did it with the packaging of the box that your new Nexus device came in. Oh, the box was Cardboard. That's interesting. Yeah. That's hilarious. Oh, yeah, that's right. Yeah. Yeah, it was awesome.
All right, next question. Going along the same theme. At CES 2018, Lenovo launched a standalone headset running on Google's Daydream platform. What was it called? I don't know if you could tell, but I have complete faith that Android XR is going to be a thing. That's why I keep bringing these VR questions. I'll give you a hint. It was the Lenovo Mirage... what?
Mirage. Mirage. What? There's another word there. Mirage Face headset. I don't know. Flip and read, what do we got? The Lenovo Mirage Boom. But it's Mirage Air. Marques just wants to fly in that airplane, I can tell. The correct answer was the Lenovo Mirage Solo.
Cool. Wow, that's depressing. Yeah, it's a little sad. Yep. Because you're the only one who can see the thing. You want to be more lonely? They were the only ones that were realistic about it. That's true. That's Solo. Yeah, this is the Solo. So you know what you're getting into: no friends. This is like the ASUS Republic of Gamers. You have a lot of friends now. That's what it's called. Republic. Republic. Democratic Republic of Gamers.
Well, I'm glad I got a pun out of that. Yeah, that was fun. Hey, let us know in the comments: would you get on a flight that goes faster than the speed of sound if the company was called Boom? If they offered you a flight, if you're on the first one, it's free. No, it's free. You're on the first plane and they go, hey, pick a destination, we'll fly you there for free. Oh my gosh. What?
I just remembered a dream I had, like, a few weeks ago. Was it flight related? No. Well, kind of. I had a flight-related dream recently. It was about going to the moon on, like, a Blue Origin thing, but it was all of us. But we had to do it, like, individually. And when it came up to my turn, I had, like, the worst nerves ever. And something went wrong with the system. And I was like, I'm fucking out. You are not putting me on that. They're like, yeah, we had some engineering errors. It's going to be an extra 10 minutes.
That's a fair choice. But Brandon was already in space. Oh, no. Wow. God, yeah. All right. Well, all of you watching, in the comments, please let us know. Yeah, hit us up. Tell me what the dream means. That's what we really want to know. Thanks for watching. Catch you next week. See you later. Waveform is produced by Adam Molina. We are part of the Vox Media Podcast Network, and our intro and outro music is by Vane. So do it. And no less. And no less.
I can hear it. I can hear it. I need to also tune in. Perfect.