Can China's AI tech compete with the US?

Jen, a lot of big earnings are coming through this week. What are you most curious to see from these big tech giants?

David, Yvonne, great to be back, thank you very much for having me. Right now, in both the US and China ecosystems, the AI space is racing ahead at breathtaking speed. The US of course leads on the infrastructure side, in terms of semiconductors and also large models as software infrastructure. China is still playing catch-up there a little bit, but in terms of applications, China's speed is second to none. I don't want to steal Kai-Fu's thunder, but I think today the race probably changes a little bit in terms of performance. A lot of the earnings impact, including NVIDIA's, is actually going to come down to how efficient those models are, how well they perform and how much data processing they require; I know, David, that is one of your favorite topics. So not only have we witnessed, in the past few months, all the big tech companies in both the US and China ecosystems release their own models, but we'll also start to see this competition on efficiency begin as well.

Kai-Fu, as Jen just mentioned, maybe the game has changed a little bit with your new model out today. Tell us a little bit about Wanzhi. Is this really China's answer to ChatGPT, and why do you think some of the previous chatbots we've seen in China have not been good enough?

Right. So we are launching today our new model called Yi-Large, whose API is being released throughout the world. We intend to be a global company, so not just for China. For China, we've developed an app on top of Yi-Large, so Yi-Large is to GPT-4 as Wanzhi is to ChatGPT, and we now have both. 01.AI, my company, was founded only one year ago, and when we started we were at least seven to eight years behind OpenAI. But now I'm proud to say that our latest model, Yi-Large, is performing comparably with GPT-4 in third-party evaluations, and building on top of a powerful model is really the only way to support a product with product-market fit that will excite users and bring about the ChatGPT moment. So we are hopeful that our model, with its performance, will provide a chatbot that will rival ChatGPT for China. But we also think the model will support all kinds of apps throughout the world. It looks like we are bracing for explosive growth, at least this next year.

Kai-Fu and Jen, I want to get both your thoughts on this, but Kai-Fu, let me start with you. You mentioned it: barely 12 months old and you're in the market. This applies to other players in this market too, the fact that barriers to entry are low, which perhaps is a good thing. How do you expect competition to change these next 12 months or so? Should we expect more competition, and how are you looking to innovate your product ahead of competition coming in?

Right. I actually don't think barriers are low. You don't see these great models and GPT-like products everywhere; across the world there may be half a dozen companies in a similar situation, in the US, China, perhaps France and other countries. So we are one of the few, and certainly the Chinese companies have achieved this level of performance at breakneck speed. That is the essence of Chinese entrepreneurship. First, we don't have as many GPUs.
So we need to figure out how to use them efficiently. For example, at 01.AI we train our models at one-half or one-third the cost of comparable companies, because we don't have access to the latest GPUs and also have many, many fewer of them compared to OpenAI. Despite these challenges, we've reached GPT-4-level performance, so we're quite proud of that. I think many of our Chinese peers have also done quite well, so a lot of what is said in the media about China being way behind is not accurate. Now, that said, companies like Meta, Google and Microsoft are putting fifty to a hundred times more resources into this, so we certainly don't take it lightly. As we catch up with the current best model, we realize that even better models will come from OpenAI and others, and we want to stay as close as we can. But we also think it's about building a great user experience, really understanding what users want and building around that, which is China's advantage. As Jen said earlier, if you think about Chinese applications, TikTok is better than Instagram; products like Temu and Shein have taken over the world, and users love them. I think China's ability to develop great applications that focus on what users want, and that product-market fit, is a unique attribute of Chinese companies.

Kai-Fu, you mentioned GPUs. I was going to ask you: it seems like you have basically been loading up on these NVIDIA chips; you foresaw that this was coming. I think when we spoke to you back in November you said you may have maybe a year, a year and a half worth of supply left. What do you think is going to happen after that? Do you think generative AI is basically going to be for those with deep pockets, who have to keep shelling out dollars for this, or do you think there are going to be smaller, newer business models that can evolve to give smaller startups here more access to these compute resources?

Well, I think there are two questions here. One is accessibility to GPUs. We're in fine shape right now. We raised a lot of money, but it's a tiny percentage of what the top American companies have, so we're really bounded not so much by what we can buy as by how much money we have. We have to be extremely parsimonious and practical and build the right models, of the right size, to fit users' needs. Now, with the resources that Google and Microsoft and Meta have, one could certainly train a 50-trillion-parameter model, but such a model will not run for everyday applications. Everyday applications require minimal latency and very good accuracy, and different applications require models of different sizes. So today we're also releasing a whole series of models, from 6B to 9B to 34B to Yi-Large, which is over 100B, and for every application we've demonstrated that we have a model size that performs as well as, if not better than, the competition. I think that demonstrates that for real practical use, we at 01.AI are able to compete on a global basis. Now, there could be a huge model that one of the top American companies develops that performs better, but it remains to be seen whether such a huge model, with the correspondingly large amount of compute it requires, can be deployed in real applications that deliver return on investment and have reasonable infrastructure costs and latency for the user.
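For context on what "a model size for every application" can look like in practice, here is a minimal sketch of running one of the smaller open-weight Yi checkpoints locally with Hugging Face transformers. The specific model ID and generation settings below are assumptions for illustration, not details given in the interview.

```python
# Minimal sketch: using a small open-weight Yi model locally instead of a giant hosted one.
# The checkpoint name "01-ai/Yi-6B-Chat" is an assumed example; a larger size (9B, 34B)
# could be swapped in if the application needs deeper reasoning.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-6B-Chat"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

messages = [{"role": "user", "content": "In one sentence, why can a smaller model be the right choice?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```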
Jen, your reaction? Kai-Fu Lee opened up many doors there.

Yeah. First of all, I'd like to congratulate Kai-Fu and his team for making such a meaningful effort to trailblaze China's software side of the infrastructure. But I also want to briefly depart from this US-China framework for a bit. What we're really talking about is the GPU-rich community versus the GPU-poor community, whether because you are in the open-source community or because geopolitical issues have starved you of the most advanced GPU supplies. The vast majority of people in the tech industry, not just within the US-China framework but in the global community, don't have a lot of GPUs the way OpenAI and a few others do. I actually see this as an opportunity in terms of where AI should go, because most of the large models right now are still something of a black box: we don't really know which set of data made the difference in the algorithm's ability to understand and make sense of things. Therefore one of the easier, lazier ways to approach this is to add more and more data and use more and more GPUs. But think of the human brain. The biggest model in the world, OpenAI's GPT-4, is rumored to have more than one trillion parameters, yet the human brain runs on about 30 watts while processing the equivalent of 100 trillion parameters, so there's no comparison. I have always said you don't get to Mars by building taller and taller buildings on Earth; building bigger and bigger models and consuming more and more GPUs is not really the way to get truly flexible AI. So I actually think the pursuit by Kai-Fu's team, and by a lot of companies and developers, of more efficient, smaller models is the future. Microsoft just released Phi-3 as well, which with around 7 billion parameters actually outperformed much larger models.

You mentioned this emergence of smaller language models. What implications is that going to have for these GPU companies, whether for their share price or for this whole race right now?

I wrote on my LinkedIn a while ago, when NVIDIA's price was at its highest, that I did not believe the price would stay that high, precisely for the reason I mentioned. But also look at what's happening in the global tech landscape. People are realizing that in the social media age, when you thought you were sharing your information and pictures with your friends and family, you were actually sharing them with Mark Zuckerberg. Now people are realizing that the most intimate questions you ask ChatGPT are not a private conversation either: you are chatting with a privately owned commercial company, OpenAI, run by Sam Altman. That's kind of creepier, I think. So because of this combination of the geopolitical situation and GPU distribution, I think this kind of smaller model, localized AI, local agents, is going to happen. There are a couple of trends to notice here. As we have already addressed, in China GPUs are becoming a resource that is very difficult to access. However, if you take a look at China's smartphones, the OnePlus 12, for example, is selling with 24GB of RAM and one terabyte of storage at around 900 US dollars. And China's EV market is developing so fast, right? It took China 27 years to produce its first 10 million EVs, but only 17 months to pass the 20 million mark. All these EVs are tablets on wheels.
So what's happening is that all these very powerful individual applications will have the capability to hold their own data locally and process it locally. Of course there's a trade-off in terms of efficiency and performance, but eventually I think those kinds of alternatives, as opposed to the very centralized approach of OpenAI, will start to flourish. They will become much more popular outside of the US system.

Kai-Fu, can I bring you in on that? Just your thoughts on the trends around efficiency that you think we should be paying attention to, not just this year but in the years to come.

Yeah, certainly. I think we're currently very fixated on the training market, which is how to take the whole world's data and train giant models. For that you do need a giant cluster, and currently NVIDIA is the best solution. But as Jen said, inference is a very different story. When you actually deploy these models, bigger is not necessarily better. You want to look at the apps you're building: if you're building something with deep reasoning skills that generates great content, you do need a large model, but for a chatbot, customer service or games, you do not. So there needs to be a model for every occasion, and that's why at 01.AI we've deployed models of different sizes, and the APIs are being made available globally at very competitive prices. An important indicator to watch is really the cost of inference. I think that is the gating factor, assuming people can always find good-enough technology for each application. It's all about driving down the cost of inference. We see inference costs dropping about 10 times a year, which means things that seem impractical today will be widespread in one year, at most two. So a proliferation of GPUs, models, agents and applications running at all different sizes will, I think, be the key to the next phase of development of LLMs.
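A rough back-of-the-envelope rendering of that 10x-a-year trend, using a purely hypothetical starting price (not a figure from the interview), shows why a workload that is 100 times over budget today would fit the same budget in roughly two years:

```python
# Back-of-the-envelope sketch of "inference costs drop ~10x a year" (per the interview).
# The starting price is a hypothetical placeholder, not a quoted figure.
start_cost_per_million_tokens = 10.0  # USD today, purely illustrative
annual_drop_factor = 10.0             # roughly 10x cheaper each year

for year in range(3):
    cost = start_cost_per_million_tokens / annual_drop_factor ** year
    print(f"year {year}: ~${cost:.2f} per million tokens")

# Prints ~$10.00, ~$1.00, ~$0.10: on this trend, an application that is 100x too
# expensive today fits the same budget in about two years.
```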
Jen, where does safety come in? When should safety come into these conversations?

I think the problem with safety right now is actually not so much a lack of discussion as a lack of understanding within the discussion. There's a lot of distraction, in terms of inflating what AI can do, that actually distracts from what needs to happen. My personal belief is that I'm a huge proponent of open source, of keeping things localized, and of a distributed approach instead of the centralized approach that all the large tech companies are taking right now. At the end of the day, when it comes to safety, everybody has a very different and very nuanced view; we all see our privacy in very different ways. So leaving it to a handful of companies, or even to regulators, to decide every single aspect of how individuals should use, access, express themselves with and interact with AI is limiting. Therefore I think the more we have the open-source approach, the localized approach, the distributed approach, combined with regulation and with value-driven leadership and concern for society from the large companies, that is probably the best recipe we can have right now.

And Kai-Fu, maybe to give you the last word here on this potential for AI to create havoc in the world: what is the counter-argument for those who are concerned that AI may be an existential threat?

Well, one concern is that AI will take over the world, like the Terminator. I would say that's completely unjustified. AI is a tool that we use. Now, I do agree that AI is very powerful and can be used by bad actors to bring about very serious harm, for example by terrorist groups and the like, so it's important to put all the checks and balances and safety measures in place. But I would point out that every great technology has come with big risks. Electricity brought the risk of electrocution, and people invented circuit breakers; the Internet on PCs brought viruses, and people invented antivirus software. So I would say this is now the call to duty for all the people working on LLMs: not just to focus on bigger, better models, but to have a renewed focus on safety. I also believe open source is important, and we are open-sourcing everything except our biggest model today as well.
