Generative AI comes in many forms. Increasingly, though, it is marketed the same way: with human names and personas that make it feel less like code and more like a colleague. A growing number of startups are anthropomorphizing AI to build trust quickly, and to soften its threat to human jobs. It is dehumanizing, and it is accelerating.
I understand why this framing has taken off. In today's upside-down economy, where every hire feels like a risk, startups, many of them emerging from the famous accelerator Y Combinator, are pitching AI not as software but as staff. They sell AI replacements. AI assistants. AI coders. AI employees. The language is deliberately designed to appeal to overwhelmed hiring managers.
Some don't even bother with subtlety. Atlog, for example, recently introduced an "AI employee for furniture stores" that handles everything from marketing to payments. One good manager, it boasts, can now run 20 stores at once. The implication: you don't need to hire more people; just let the system scale for you. (What happens to the 19 managers it replaces goes unsaid.)
Consumer-facing startups are using similar tactics. Anthropic named its platform "Claude" because it sounds like a warm, trustworthy companion rather than a faceless, disembodied neural network. It's a tactic straight out of the fintech playbook, where apps like Dave, Albert, and Charlie masked their transactional nature behind approachable names. When it comes to managing money, people are more willing to trust a "friend."
The same logic has crept into AI. Would you rather share sensitive data with a machine-learning model, or with your bestie Claude, who remembers you, greets you warmly, and almost never threatens you? (To OpenAI's credit, it still tells you that you're chatting with a "generative pre-trained transformer.")
But we're reaching a tipping point. I'm genuinely excited about generative AI. Yet every new "AI employee" has started to feel more dehumanizing. Every new "Devin" makes me wonder when the actual Devins of the world will push back on being abstracted into job-replacing bots.
Generative AI is no longer a curiosity. Its reach is growing, even as its impacts remain murky. In mid-May, 1.9 million unemployed Americans were receiving continued unemployment benefits, the highest level since 2021. Many of them were laid-off tech workers. The signals are piling up.
Some of us still remember 2001: A Space Odyssey. HAL, the ship's onboard computer, starts out as a calm, helpful assistant before turning fully homicidal and cutting off the crew's life support. It's science fiction, but it struck a nerve for a reason.
Last week, Anthropic CEO Dario Amodei predicted that within one to five years, AI could push unemployment as high as 20%. "Most [of these workers are] unaware that this is about to happen," he told Axios. "It sounds crazy, and people just don't believe it."
You could argue that this isn't comparable to cutting off someone's oxygen, but the metaphor isn't that far off. Automating more people out of their paychecks will have consequences, and as layoffs mount, AI's branding as a "colleague" will look less clever and more callous.
The shift to generative AI is happening no matter how it's marketed. But companies have a choice in how they describe these tools. IBM never called its mainframes "digital colleagues." PCs weren't "software assistants"; they were workstations and productivity tools.
Language still matters. Tools should empower. But more and more companies are marketing something else, and that feels like a mistake.
We don't need more AI "employees." We need software that extends the potential of actual humans, making them more productive, more creative, and more competitive. So stop talking up fake workers. Just show us the tools that help great managers run complex businesses, and that help individuals have more impact. That's all anyone really needs.