
WATCH MEMORY

All podcast episode summaries matching WATCH MEMORY — aggregated across every podcast we track.

3 episodes · Page 1/1

Quotes & Clips tagged WATCH MEMORY

21 on this page

Sundar spends a dedicated hour weekly tracking compute allocation by team

“But now it is really acutely constrained. Right? So you spend a lot more time. I at least spend a dedicated hour a week thinking about that question at a pretty granular level. So I will know by projects and by teams the compute units they are using. Right? And, you know, or at least I have that information, and I'm looking at it and assessing it.”

— Sundar Pichai, CEO of Google and Alphabet

Google is in early stages of building data centers in space

“We are constantly trying to take these long term projects, which when you first announce them look slightly, marginally ridiculous. Okay. You know, like, we're in the earliest stages of thinking about data centers in space. But to your earlier point that constraint inspires creativity: if you take a twenty year outlook, right, where are you going to put most of these data centers? Really hard problems to solve.”

— Sundar Pichai, CEO of Google and Alphabet

Preferred stocks serve as supplemental income tools

“Typically, people want to invest in preferred stocks because they have a bit of an income focus. They typically pay higher fixed dividends than common stocks and often have some pretty attractive yields, anywhere from 5% to 8%, and so that makes them pretty attractive for these steady cash flow seekers. But that also changes their position in the capital stack. They sit above common equity, meaning if a company goes bankrupt, you would get some claim on assets before common stock shareholders do, but they are below bonds as well.”

— Luke Guerrero
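The priority ordering Guerrero describes (bonds above preferred, preferred above common) can be illustrated as a toy liquidation waterfall. This is only a sketch; the function name and all figures are hypothetical, not from the episode.

```python
def liquidation_waterfall(assets, bonds_due, preferred_due):
    """Toy model of the capital stack described in the quote:
    bondholders are paid first, then preferred shareholders,
    and common equity receives whatever remains."""
    paid_bonds = min(assets, bonds_due)
    remaining = assets - paid_bonds
    paid_preferred = min(remaining, preferred_due)
    paid_common = remaining - paid_preferred
    return paid_bonds, paid_preferred, paid_common

# With $100M of assets against $80M of bonds and $30M of preferred claims,
# preferred holders recover only partially and common gets nothing.
print(liquidation_waterfall(100, 80, 30))  # (80, 20, 0)
```

The ordering, not the numbers, is the point: common shareholders are residual claimants, which is why preferred stock trades the upside of common equity for steadier, bond-like income.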

Library version changes expose limits of context-only learning

“Imagine your favorite JavaScript library, like, let's say, React. Right, you learn through all of your pretraining data that there is a function called x. But at some point, a new version of React comes out, and it turns out that it's a breaking change, and all of a sudden, the x function doesn't exist. It's now a y function. No matter how much you say it in the context, you cannot just override what's the most intuitive throughout all of the model parameters, which is to basically say x.”

— Malika Aubakirova

In-context learning works but hits a real ceiling

“Any honest argument about continual learning pretty much has to start with in-context learning, because it genuinely works. We see that with examples like Karpathy's auto research project. Kind of like the other example we give in the article is OpenCloud. Like, the underlying model was available to anyone, but what's really made it a special, magical moment is this kind of orchestration of the context.”

— Malika Aubakirova

Out-of-distribution learning post-deployment is the key milestone

“The test that some people use currently is pretty simple. You basically train a model that has learned on x y z data. And once you deploy, you just want to check whether it learns something out of distribution, something that it hasn't seen before. And we are starting to see some examples, like the test-time training done by Yusan, with the discover paper that makes some of these novel inventions.”

— Malika Aubakirova

High bandwidth memory demand drives Micron growth

“Micron Technology has had quite a positive performance recently. It is a leading semiconductor memory and storage company, so they make DRAM, NAND flash, and high bandwidth memory. It's one of the few major global suppliers in an industry that has a lot of high barriers to entry, not just because of input costs, but also because of the required technological understanding and ability to develop these very complex and small products. This HBM, it's high bandwidth memory, right? This is really what's been driving it here.”

— Luke Guerrero

Younger AI-native companies have a structural advantage over incumbents

“I think with your question earlier on, you know, you were asking in the context of, like, way more robotics companies. I do think that's one advantage startups are gonna have: more AI native teams. And, you know, you can probably get at it through your interview processes, etcetera. Whereas for us, we would have, like, retraining, transformation, etcetera. And I think that's maybe an advantage the younger companies are gonna have.”

— Sundar Pichai, CEO of Google and Alphabet

Industrials show resilience despite shifting economic narratives

“Ticker XLI is the State Street Industrial Select Sector SPDR ETF... Even if the economy is starting to turn, you're not really seeing signs of a breakdown in industrials, and certainly not seeing a sign of a breakdown in this ETF. A lot of what has happened in the market this week has been centered around the narrative of AI, AI spending, and really the lack of monetization across the board, and hasn't really spilled over into other sectors. Unless you're overweight a ridiculous amount relative to what the market weight is of industrials, I don't see any reason why you would want to fully exit this industrials ETF.”

— Luke Guerrero

Memory is the most acute supply constraint in 2026

“Memory is definitely one of the most critical components now. There is no way that the leading memory companies are going to dramatically improve their capacity. So you have those constraints in the short term, but they get more relaxed as you go out. By the way, I think it'll push a lot of innovation. We will make these things 30x more efficient.”

— Sundar Pichai, CEO of Google and Alphabet

Adversarial security requires updating weights, not prompts

“The first one is essentially in adversarial security. Like, imagine there is a new jailbreak attack. You have your model deployed in the wild, and it's being used. Imagine you try to update your system prompt to say, like, don't do this. It's not going to work. Right? Because all of the parameters in the model have learned to be helpful to the users. So you really have to encompass that kind of knowledge in the weights, which the attackers don't have access to.”

— Malika Aubakirova

Sundar increased Waymo investment when others got pessimistic

“Waymo was a great example where I think we increased our investment two to three years ago, when the rest of the world got pessimistic on it, when others, some of the people, were backing off. For example, if Waymo had reached this point earlier, I think I would have invested the capital earlier. I would have been glad to invest more capital in Waymo earlier, but we weren't at the level of maturity needed to do that.”

— Sundar Pichai, CEO of Google and Alphabet

Transformers were built to solve product problems, not just research

“Transformers, and a lot of these things like TPUs, were all done to solve a specific product need to some extent. Right? Like, the team's thinking about how to make translation better. In the case of TPUs, how do you make speech rec work? We suddenly have to serve it to two billion people. We don't have enough chips for it.”

— Sundar Pichai, CEO of Google and Alphabet

Gemma 4 model weights fit on a USB stick

“I'm coming here as we just shipped Gemma 4. And it's a really good open source model. The gap from the frontier to Gemma 4 is both huge and not so huge in terms of time. Like, Gemma 4 is based on the Gemini 3 architecture. You know, it's a very weird thing. Right? You're talking about a set of weights which can fit on a USB stick.”

— Sundar Pichai, CEO of Google and Alphabet

Google was built for the AI moment but had to execute better

“Hey, the Overton window shifted. I felt like the company was built for that moment. The vertical thing, it's not an accident or something. It was very intentful: we were on the seventh version of TPUs. So to me, we were behind in terms of frontier LLM models, but we had all the capabilities internally, and we had to execute to meet the moment.”

— Sundar Pichai, CEO of Google and Alphabet

The ultimate test is models that learn on the job like humans

“We humans are not AGI, but we still learn on the job. We learn from experience, and that's what makes humans kind of unique. And so that's kind of the ultimate test. Like, how do we define that we got to continual learning? It's like, well, is there a system that is able to learn on the job and get better through use, just like humans?”

— Malika Aubakirova

Software volatility stems from AI monetization concerns

“What really drove the market today? Well, it was the feeling that there was a bit of a bottoming in software. It was a focus as groups attempted to break seven straight declines. Bit of a pickup in commentary. We saw a lot of discussion about very oversold conditions and some pushback against some of the more dire predictions revolving around that AI competition that was part and parcel of why the market was doing what it was doing. In addition, you had elevated hyperscaler capex, still a positive for the broader AI trade.”

— Luke Guerrero

Search latency budgets are measured in single-digit milliseconds

“But to give an example, like, search. You know, I was speaking with the teams. Right? They now have, for sub teams, latency budgets in the milliseconds. You'll get 50% credit: if you ship something which shaves off three milliseconds, you earn one point five milliseconds for your latency budget, and one point five milliseconds gets passed on to the user.”

— Sundar Pichai, CEO of Google and Alphabet
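The 50% credit scheme in the quote is simple arithmetic: half of any shipped latency saving accrues to the team's budget and half is passed on to the user. A minimal sketch of that split (the function name is assumed for illustration, not Google's actual tooling):

```python
def split_latency_saving(saved_ms, credit_rate=0.5):
    """Split a shipped latency improvement between the team's
    latency budget and the end user, per the 50% credit scheme."""
    team_credit = saved_ms * credit_rate
    user_gain = saved_ms - team_credit
    return team_credit, user_gain

# Shaving off 3 ms earns the team 1.5 ms of budget;
# the other 1.5 ms goes to the user.
print(split_latency_saving(3.0))  # (1.5, 1.5)
```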

AI models today are frozen like the amnesiac in Memento

“The main protagonist, Leonard Shelby, has a form of this amnesia where he cannot form new memories. So he goes about his life with, kind of, this cutoff date, after which point he has these long term memories but really cannot retain anything new that he experiences. And so what he does is he uses sticky notes, where he writes some of the notes to himself. He pulls out his Polaroid camera to capture the moments as he goes on about his life. And, I mean, he even tattoos some of the memories that he wants to imprint in his memory.”

— Malika Aubakirova

Learning happens across context, modules, and weights

“We make this, like, very high level framework in terms of just the three buckets of the context, the modules, and also the weights. And the one callout that I think is important is that all of these are learning mechanisms. Even in-context learning is still a form of continual learning. But context is essentially what we call nonparametric learning, where we don't actually update the weights.”

— Malika Aubakirova

AI data centers trigger a utility super cycle

“We've all seen how tech earnings calls have really highlighted a new risk: there isn't enough electricity to power the new data centers. We'll talk about the utility super cycle as power companies raise rates to build this new capacity. We'll also touch on a story I saw today about KPMG pressing its auditors to cut costs, or rather to cut the price they charge KPMG, due to AI cost savings. What does that mean for the audit industry and really industry as a whole?”

— Luke Guerrero
