The article at the link below is from MIT Technology Review, and it is about the entire AI industry. However, it does mention ChatGPT, so I thought it might be suitable for this topic: https://www.technologyreview.com/2024/0 … wtab-en-us
The article is about how AI may (or, more likely, will not) influence the productivity of global economies in the near future.
The article quotes economists who are pessimistic in the short term but bullish over the long term, if things go "right".
The question is still very much open as to whether builders of large language models will make them useful for manufacturing, where they apparently have almost no penetration.
The article is by David Rotman
How to fine-tune AI for prosperity
Artificial intelligence could put us on the path to a booming economic future, but getting there will take some serious course corrections.
By David Rotman
August 20, 2024
The article is on the lengthy side.
(th)
This post is about ChatGPT Plus....
Some time ago, I decided to enlist in ChatGPT Plus, and I am being billed by OpenAI. The bill just shows up, though, with nothing sent to me to provide detail, so aside from seeing the money disappear each month, I am less than certain I am actually receiving the service.
On the ** other ** hand, whichever version of ChatGPT I am working with says it is GPT-4o, which is better than the free ChatGPT 3.5.
Meanwhile, a relative has finally decided to take the plunge and enlist in ChatGPT Plus. This just happened, and he confirms he sees all (or at least some) of the enhanced capabilities that come with the Plus subscription, but at this point, he has not checked to see which version ChatGPT thinks it is.
The relative was getting into difficulties because ChatGPT 3.5 was becoming confused when dealing with a very complex question that the relative and I are working on. I am hoping the upgrade will make a difference.
(th)
This post is about large language models, of which ChatGPT is just one example....
https://www.newyorker.com/science/annal … wtab-en-us
The article at the link above reveals that the astonishing performance I see every day in GPT-4o (the paid version, not the free one) was achieved by accident, and that (apparently) no one involved at the time or since has any idea how the systems work or what they are doing.
All the speculation we see about "coding" turns out to be meaningless....
Here's the tail end of the article:
The element of accident in the transformer’s outsized success has evoked a notable humility in its inventors. When I asked Parmar how she would rate our current understanding of the models developed with the transformer, she said, “Very low.” I asked, How low? Ten per cent? One per cent? She shrugged: “How do we understand other humans? It will be the same with the models.” It’s fitting that the architecture outlined in “Attention Is All You Need” is called the transformer only because Uszkoreit liked the sound of that word. (“I never really understood the name,” Gomez told me. “It sounds cool, though.”)
We’ve never had a truly “other” language before—a new and alien form of discourse that understands in a way we can’t understand. It’s unsurprising, then, that some of the people who were present at the creation are agog at the technology they brought into being. The production of A.I. seems to carry a powerful side effect: as the machines generate intelligence, they also generate mystery. Human misunderstanding endures, possibly a permanent condition. ♦
I hope that NewMars members who venture an opinion on what large language models are or how they work will be inspired with a bit of humility.
The creators of the phenomenon are humble enough to admit they have no idea what is going on, and may never know.
(th)
I haven't seen this message for a while ...
The engine is currently overloaded. Please try again later.
I'd ** just ** received advice in support of the proposition that if we want to serve animation to readers of the book we are working on, we need to place the processing load on the local machine. I have ** some ** experience with JavaScript (as reported in a topic with that word in the title), so I asked whether JavaScript might be enlisted to do the animation. GPT-4o confirmed this would work, and it generated a sample HTML page that can be served by Flask.
We ** just ** got Flask working, and I am thus encouraged to think that we might be able to scale this service, despite the limitations of the minimal Azure account we have in place right now.
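For readers following along, here is a minimal sketch of the kind of setup described above. The route name, page contents, and moving-square animation are my own illustration, not the exact code ChatGPT produced; the point is simply that Flask sends the page once and the JavaScript animation then runs entirely in the visitor's browser.

from flask import Flask

app = Flask(__name__)

# Minimal page with a JavaScript animation. All of the drawing work
# happens in the visitor's browser; the server only sends this text once.
PAGE = """
<!doctype html>
<html>
  <body>
    <canvas id="c" width="400" height="200"></canvas>
    <script>
      const ctx = document.getElementById("c").getContext("2d");
      let x = 0;
      function step() {
        ctx.clearRect(0, 0, 400, 200);   // erase the previous frame
        ctx.fillRect(x, 80, 40, 40);     // draw a square at its new position
        x = (x + 2) % 400;
        requestAnimationFrame(step);     // browser schedules the next frame
      }
      step();
    </script>
  </body>
</html>
"""

@app.route("/animation")   # hypothetical route name, for illustration only
def animation():
    # No per-frame work on the server: it just returns the static page.
    return PAGE

if __name__ == "__main__":
    app.run()

Because requestAnimationFrame runs on the client, serving more readers mostly means delivering more copies of a static page, which is exactly the kind of load a minimal Azure account should be able to handle.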
(th)