The Weird Thing About ChatGPT That No One Is Talking About

Imagine you had a personal search assistant who could not only track down answers in a fraction of a second but also break down complex topics, offer personalized recommendations, and even do your work for you. It’s a scenario you might not have to imagine for much longer, because Microsoft, through ChatGPT, is working to make it a reality as soon as possible.

Search engines haven’t changed much since their launch nearly three decades ago. Sure, they’re more efficient, but for the most part they still function the same way.

You enter your query in a text box, press Enter, and then scroll through a list of links to websites that host the answers to your questions.

Most of the time this is fine, but finding the information you need can still be a frustrating experience. Google has improved its search engine to produce instant answers to basic questions like “What is the capital of France?” But for more complex topics, you still have to sift through multiple websites to find what you’re looking for.

This is what ChatGPT is trying to change. In case you’ve somehow avoided the internet over the last few months and don’t know what ChatGPT is, it’s a hyper-advanced chatbot created by the artificial intelligence research laboratory OpenAI, capable of holding realistic, human-like conversations.

It’s a type of artificial intelligence known as a large language model, or LLM. Chatbots themselves have actually existed for a long time, dating all the way back to the mid-1960s.

These earlier versions were nowhere near as sophisticated: they used rigid, pre-programmed formulas that created an illusion of genuine communication but were severely limited in their range of possible responses.
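
To see what “rigid, pre-programmed formulas” looked like in practice, here is a minimal sketch of a 1960s-style rule-based chatbot in Python. The patterns and canned replies are invented for illustration and aren’t taken from any specific historical program.

```python
# A minimal sketch of the rigid, pre-programmed approach used by 1960s-era
# chatbots. The patterns and canned replies below are hypothetical examples.
import re

RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bbecause\b", re.IGNORECASE), "Is that the real reason?"),
]
FALLBACK = "Please tell me more."

def reply(user_input: str) -> str:
    # Try each hand-written pattern in order; the first match decides the response.
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(*match.groups())
    # Nothing matched, so fall back to a generic canned prompt.
    return FALLBACK

print(reply("I feel anxious about exams"))  # -> "Why do you feel anxious about exams?"
```

Every response the program can ever give is hand-written in advance, which is exactly why these systems broke down the moment a conversation wandered off-script.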

What sets ChatGPT apart is its ability to hold fluid, free-flowing dialogues with its users. It can navigate the non-linear speech patterns of everyday conversation, ask follow-up questions, reject inappropriate requests, and even admit when it has made a mistake and correct itself.

Essentially, ChatGPT is an incredibly sophisticated autocomplete system that predicts which word should come next in a given sentence. There’s no coded set of facts it’s drawing from; it’s simply trained to produce the most plausible-sounding response.
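
To make the “sophisticated autocomplete” idea concrete, here is a minimal sketch of next-word prediction using the small, openly available GPT-2 model from the Hugging Face transformers library as a stand-in. ChatGPT’s own model isn’t publicly downloadable, so the model choice and prompt here are illustrative assumptions, not how ChatGPT itself is accessed.

```python
# A minimal sketch of next-word prediction, the core mechanic behind LLMs.
# GPT-2 is used here as a small, public stand-in for ChatGPT's model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every possible next token

# Inspect the model's top guesses for the word that should follow the prompt.
next_token_scores = logits[0, -1]
top = torch.topk(next_token_scores, k=5)
for token_id, score in zip(top.indices, top.values):
    print(repr(tokenizer.decode([int(token_id)])), float(score))
```

The model isn’t looking anything up; it is ranking candidate next words by how plausible they look given its training data, which is why fluent-sounding answers can still be wrong.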

ChatGPT grew to over 100 million monthly users just two months after becoming publicly available, making it the fastest-growing consumer application in history. People are using it to reply to emails, check software code, write articles, and even prepare their tax returns.

For all the amazing things it can do, though, ChatGPT hasn’t been without controversy. One of the scariest things about the rise of AI is that a lot of people may sadly lose their jobs.

ChatGPT itself told me that jobs like data entry clerks, bank tellers, and assembly-line workers are at risk of being taken over by automation. In light of this, it has become more important than ever to learn high-value skills that cannot be easily automated out of existence.

Plagiarism is skyrocketing as high school and college students use the program to write their papers for them, leading many commentators to declare it the death of the essay. In another somewhat ironic twist, the popular science-fiction magazine Clarkesworld was forced to close its open submissions after being flooded with a wave of AI-generated short stories.

More concerning, though, is how the program is being used to replace workers. Media giant BuzzFeed laid off 12% of its employees last December, and since then managers have outsourced some of that labor to ChatGPT. BuzzFeed CEO Jonah Peretti has stated that going forward, AI will play a larger role in the company’s operations, and they’re not the only ones.

Microsoft was one of OpenAI’s earliest backers, and last month the tech giant committed to a multi-year, $10 billion investment. The two companies are currently integrating ChatGPT into Microsoft’s flagging search engine, Bing.

The hope is that, through the power of artificial intelligence, Bing will deliver faster, more accurate results while also handling more complex tasks like tutoring kids or organizing your schedule.

In effect, it won’t just be a search engine; it will be a personal assistant with encyclopedic knowledge. Though the AI-powered version of Bing isn’t yet available to the general public, it’s already triggering a migration away from Google.

In response, Google executives recently declared a “code red” corporate emergency, scrambling to bring their own AI search assistant to market. Google’s AI assistant is named Bard, and it has actually been in development for years.

Unfortunately, it isn’t quite ready to meet the public just yet, as its much-anticipated demo back in February made clear. The AI made several faux pas, including incorrectly crediting the recently launched James Webb Space Telescope with taking the first photos of a planet outside our solar system.

That feat was actually accomplished in 2004 by the European Southern Observatory’s Very Large Telescope. Bard’s blunder cost Google roughly $100 billion in market value and, in essence, prompted the company to open up the system to wider testing.

Bard’s error highlights a huge problem with AI-powered search engines that not many people are talking about, one that could pose a serious threat to society if not handled properly. Rather than delivering a list of relevant links and other pertinent information to sort through, Bard and ChatGPT offer only a single answer to any query.

John Henshaw, the director of search engine optimization for Vimeo, says this makes these programs both less efficient than conventional search engines and more dangerous. In an interview, Henshaw said that with conversational AI, “I think society has the most to lose.”

Having AI take over search means people will be spoon-fed information that is limited, homogenized, and sometimes incorrect. It will affect our capacity to learn and will suffocate the open web as we know it. And it’s not just a matter of these programs returning inaccurate results: in the most extreme cases, they’ve conjured entire data sets seemingly out of nowhere.

One of the weirder examples of this happened when a reporter asked ChatGPT to write an essay about a Belgian chemist and political philosopher who never actually existed. That didn’t stop ChatGPT from writing an entire biography of the fictional character, full of fabricated facts.

AI experts refer to this kind of phenomenon as hallucination, and no one is entirely certain why it happens. Even ChatGPT’s creators can’t say where the program came up with this information.

FAQ:

How many languages can ChatGPT understand?

ChatGPT is multilingual. It can understand and respond in dozens of languages, though it performs best in English.

Can ChatGPT translate languages?

Yes. It can translate text between many languages, though it isn’t a dedicated translation tool.

Can ChatGPT write stories or articles?

Yes. Generating stories, articles, and other long-form text is one of its most common uses.

Can ChatGPT generate images or videos?

Only in a limited sense. ChatGPT itself generates text; producing images or videos requires separate AI models.

