
Don’t Use ChatGPT For These Five Tasks

Summary: Don’t use language models like ChatGPT and BERT for tasks such as medical, legal, or financial advice, online interactions, or writing articles. They generate human-sounding text from the input they receive, but they have limitations, including a limited understanding of context, potential bias in their text, and a lack of expertise. Using language models for such tasks can lead to inaccurate results and ethical issues.

Language models like ChatGPT, which is built on OpenAI’s GPT-3, and BERT by Google are designed to generate human-sounding text from the input they are given. However, as these systems grow in popularity, it is vital to understand their limitations.

You can put them to good use for tasks like answering customer service questions, text completion and suggestion, summarisation and translation, analysis, information retrieval, text-to-speech and speech-to-text, and many more. Yet there is plenty they cannot do, so first take a look at their limitations below.
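To make one of those capabilities concrete, here is a minimal sketch of a summarisation call, not taken from the article: it assumes the openai Python package (the pre-1.0, v0.x interface) and an API key exported as OPENAI_API_KEY, and the model name and prompt are purely illustrative.

```python
# A minimal sketch of asking a GPT-3 model to summarise text.
# Assumes: `pip install openai` (v0.x interface) and an API key
# exported as OPENAI_API_KEY. Model name and prompt are illustrative.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

article = "Paste the long text you want summarised here."

response = openai.Completion.create(
    model="text-davinci-003",  # a GPT-3 completion model (assumption)
    prompt=f"Summarise the following in two sentences:\n\n{article}",
    max_tokens=100,            # cap the length of the generated summary
    temperature=0.3,           # lower values give more predictable output
)

# The completion endpoint returns candidate texts under "choices".
print(response["choices"][0]["text"].strip())
```

Even on a simple task like this, the output can vary from run to run, which is exactly why the limitations below matter.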


Text generation algorithm representation. Image Source: Nieman Journalism Lab.


Limitations of ChatGPT and other LLMs

  1. They have a limited understanding of context, so while the text they generate may be grammatically correct, it may not be accurate.
  2. These engines consume huge amounts of data and can regurgitate text that is biased or offensive, something a human writer or editor knows to avoid.
  3. They lack expertise in any specific field, so while they know a lot of things, they are experts in none.
  4. Because they cannot truly think, their results are often repetitive and can sometimes make little sense.

Don’t use ChatGPT or other language models for…

Don’t use ChatGPT for medical advice and other tasks that require high expertise and accuracy. Image Source: iStock.
  1. Medical advice: There is a ton of medical data online, and much of it has been fed into language models to help them generate text. However, since these models are not experts in the field of medicine, their answers may be wrong, whether because the data you supply is inaccurate or because their training data is outdated.
  2. Legal advice: Like the above, these systems lack the necessary expertise, despite their ability to answer several law-related questions. They are also not up to date with new developments in the field, and they cannot give advice tailored to a particular case because they do not take all the necessary factors into consideration.
  3. Financial advice: Finance has a lot to do with local regulations, and some of these laws may be missing from the language model’s training data set, making it an inaccurate source or advisor.
  4. Online interactions: Interacting well with people requires understanding their situation, something language models are likely to ignore; they may address everyone the same way, which can end up offending someone.
  5. Writing articles: Many writers are turning to ChatGPT and other LLMs to speed up the writing process, but trusting these systems completely with this kind of task can lead to inaccuracies and ethical issues the model is unaware of.


For your daily dose of tech, lifestyle, and trending content, make sure to follow Plat4om on Twitter @Plat4omLive, on Instagram @Plat4om, on LinkedIn at Plat4om, and on Facebook at Plat4om. You can also email us at info@plat4om.com and join our channel on Telegram at Plat4om. Finally, don’t forget to subscribe to our YouTube channel.

Onwuasoanya Obinna

A reader of books and stringer of words. Passionate about Science and Tech. When not writing or reading, he is surfing the web and tweeting.