
Javier E

Neal Stephenson's Most Stunning Prediction - The Atlantic

  • Think about any concept that we might want to teach somebody—for instance, the Pythagorean theorem. There must be thousands of old and new explanations of the Pythagorean theorem online. The real thing we need is to understand each child’s learning style so we can immediately connect them to the one out of those thousands that is the best fit for how they learn. That to me sounds like an AI kind of project, but it’s a different kind of AI application from DALL-E or large language models. (A rough sketch of this matching idea follows this list.)
  • Right now a lot of generative AI is free, but the technology is also very expensive to run. How do you think access to generative AI might play out?
  • Stephenson: There was a bit of early internet utopianism in the book, which was written during that era in the mid-’90s when the internet was coming online. There was a tendency to assume that when all the world’s knowledge comes online, everyone will flock to it
  • ...3 more annotations...
  • It turns out that if you give everyone access to the Library of Congress, what they do is watch videos on TikTok
  • A chatbot is not an oracle; it’s a statistics engine that creates sentences that sound accurate. Right now my sense is that it’s like we’ve just invented transistors. We’ve got a couple of consumer products that people are starting to adopt, like the transistor radio, but we don’t yet know how the transistor will transform society
  • We’re in the transistor-radio stage of AI. I think a lot of the ferment that’s happening right now in the industry is venture capitalists putting money into business plans, and teams that are rapidly evaluating a whole lot of different things that could be done well. I’m sure that some things are going to emerge that I wouldn’t dare try to predict, because the results of the creative frenzy of millions of people is always more interesting than what a single person can think of.
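A minimal sketch of the matching idea in the first annotation above: given a learner profile and a pool of candidate explanations of the same concept, score each explanation against the profile and surface the best fit. The feature axes, the toy data, and the cosine-similarity scoring below are all illustrative assumptions, not anything described in the article.

```python
# Sketch: rank candidate explanations of a concept by fit to a learner profile.
# All feature names and numbers are invented placeholders for illustration.
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two equal-length feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical feature axes: (visual, verbal, proof-oriented, hands-on)
learner_profile = [0.9, 0.2, 0.1, 0.7]

explanations = {
    "animated geometric proof":          [0.95, 0.10, 0.30, 0.40],
    "step-by-step algebraic derivation": [0.10, 0.80, 0.90, 0.10],
    "build-it-yourself paper demo":      [0.60, 0.20, 0.10, 0.95],
}

# Sort explanations by similarity to the learner profile, best fit first.
ranked = sorted(explanations.items(),
                key=lambda kv: cosine(learner_profile, kv[1]),
                reverse=True)

for title, features in ranked:
    print(f"{cosine(learner_profile, features):.2f}  {title}")
```

In practice the profile and explanation features would come from a learned model rather than hand-set vectors; the point is only that the matching step is a ranking problem, distinct from the generative systems the interview goes on to discuss.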
Javier E

The AI Revolution Is Already Losing Steam - WSJ

  • Most of the measurable and qualitative improvements in today’s large language model AIs like OpenAI’s ChatGPT and Google’s Gemini—including their talents for writing and analysis—come down to shoving ever more data into them. 
  • AI could become a commodity
  • To train next generation AIs, engineers are turning to “synthetic data,” which is data generated by other AIs. That approach didn’t work to create better self-driving technology for vehicles, and there is plenty of evidence it will be no better for large language models,
  • ...25 more annotations...
  • AIs like ChatGPT rapidly got better in their early days, but what we’ve seen in the past 14-and-a-half months are only incremental gains, says Marcus. “The truth is, the core capabilities of these systems have either reached a plateau, or at least have slowed down in their improvement,” he adds.
  • the gaps between the performance of various AI models are closing. All of the best proprietary AI models are converging on about the same scores on tests of their abilities, and even free, open-source models, like those from Meta and Mistral, are catching up.
  • models work by digesting huge volumes of text, and it’s undeniable that up to now, simply adding more has led to better capabilities. But a major barrier to continuing down this path is that companies have already trained their AIs on more or less the entire internet, and are running out of additional data to hoover up. There aren’t 10 more internets’ worth of human-generated content for today’s AIs to inhale.
  • A mature technology is one where everyone knows how to build it. Absent profound breakthroughs—which become exceedingly rare—no one has an edge in performance
  • companies look for efficiencies, and the question of who is winning shifts from who is in the lead to who can cut costs to the bone. The last major technology this happened with was electric vehicles, and now it appears to be happening to AI.
  • the future for AI startups—like OpenAI and Anthropic—could be dim.
  • Even if Microsoft and Google are able to entice enough users to make their AI investments worthwhile, doing so will require spending vast amounts of money over a long period of time, leaving even the best-funded AI startups, with their comparatively paltry war chests, unable to compete.
  • Many other AI startups, even well-funded ones, are apparently in talks to sell themselves.
  • the bottom line is that for a popular service that relies on generative AI, the costs of running it far exceed the already eye-watering cost of training it.
  • That difference is alarming, but what really matters to the long-term health of the industry is how much it costs to run AIs. 
  • Changing people’s mindsets and habits will be among the biggest barriers to swift adoption of AI. That is a remarkably consistent pattern across the rollout of all new technologies.
  • the industry spent $50 billion on chips from Nvidia to train AI in 2023, but brought in only $3 billion in revenue.
  • For an almost entirely ad-supported company like Google, which is now offering AI-generated summaries across billions of search results, analysts believe delivering AI answers on those searches will eat into the company’s margins
  • Google, Microsoft and others said their revenue from cloud services went up, which they attributed in part to those services powering other companies’ AIs. But sustaining that revenue depends on other companies and startups getting enough value out of AI to justify continuing to fork over billions of dollars to train and run those systems
  • three in four white-collar workers now use AI at work. Another survey, from corporate expense-management and tracking company Ramp, shows about a third of companies pay for at least one AI tool, up from 21% a year ago.
  • OpenAI doesn’t disclose its annual revenue, but the Financial Times reported in December that it was at least $2 billion, and that the company thought it could double that amount by 2025. 
  • That is still a far cry from the revenue needed to justify OpenAI’s now nearly $90 billion valuation
  • the company excels at generating interest and attention, but it’s unclear how many of those users will stick around. 
  • AI isn’t nearly the productivity booster it has been touted as
  • While these systems can help some people do their jobs, they can’t actually replace them. This means they are unlikely to help companies save on payroll. He compares it to the way that self-driving trucks have been slow to arrive, in part because it turns out that driving a truck is just one part of a truck driver’s job.
  • Add in the myriad challenges of using AI at work. For example, AIs still make up fake information,
  • getting the most out of open-ended chatbots isn’t intuitive, and workers will need significant training and time to adjust.
  • That’s because AI has to think anew every single time something is asked of it, and the resources that AI uses when it generates an answer are far larger than what it takes to, say, return a conventional search result (a rough cost sketch follows this list)
  • None of this is to say that today’s AI won’t, in the long run, transform all sorts of jobs and industries. The problem is that the current level of investment—in startups and by big companies—seems to be predicated on the idea that AI is going to get so much better, so fast, and be adopted so quickly that its impact on our lives and the economy is hard to comprehend. 
  • Mounting evidence suggests that won’t be the case.
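A back-of-envelope sketch of the running-cost point flagged in the annotations above: a training bill is paid once, while inference costs recur with every query, and each generated answer is assumed to cost far more than a conventional search result. Every figure below is an invented placeholder chosen only to make the arithmetic concrete; none comes from the article.

```python
# Back-of-envelope sketch of training cost vs. ongoing inference cost.
# Every number is an assumed placeholder, not a reported figure.

training_cost = 100e6            # assumed one-time training spend, in dollars
cost_per_ai_query = 0.005        # assumed cost to generate one chatbot answer
cost_per_search_query = 0.0002   # assumed cost to serve one conventional search result
queries_per_day = 200e6          # assumed daily query volume
days_per_year = 365

# Training is paid once; inference is paid on every query, indefinitely.
yearly_inference_cost = cost_per_ai_query * queries_per_day * days_per_year
yearly_search_cost = cost_per_search_query * queries_per_day * days_per_year

print(f"one-time training cost:           ${training_cost:>13,.0f}")
print(f"one year of generative answers:   ${yearly_inference_cost:>13,.0f}")
print(f"one year of conventional search:  ${yearly_search_cost:>13,.0f}")
print(f"inference vs. training:           {yearly_inference_cost / training_cost:.1f}x")
print(f"generative vs. search, per query: {cost_per_ai_query / cost_per_search_query:.0f}x")
```

With these placeholder numbers, a year of serving generated answers costs a few times the training run and roughly 25 times what the same query volume would cost as conventional search, which is the shape of the gap the article describes.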