
AI: the good, the bad and the ugly



In recent times, especially since the successful launch of ChatGPT to the general public in November 2022, there has been a frenzy around AI and AI-powered tools. We have also witnessed significant capital flows readjusting from previously hot sectors such as web3, which raised more than $10 billion in 2021 but has since seen funding fall by around 76%, towards AI start-ups.

AI, which stands for Artificial Intelligence, has been around since the mid-20th century and can be credited to the work of Alan Turing, the British mathematician and computer scientist who in 1935 came up with the stored-program concept. The concept described a computing machine with unlimited memory that could scan through that memory symbol by symbol, reading and writing further symbols while following instructions stored in the same memory.
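To make the idea concrete, here is a minimal sketch in Python of such a machine stepping along a tape of symbols, reading and writing according to a table of instructions. The rule table is a hypothetical toy example of my own (it simply flips bits), and for brevity the instructions sit in a separate table rather than in the same memory as the data, as Turing's concept would have it:

    def run_machine(tape, rules, state="start"):
        """Step along the tape until the machine reaches the 'halt' state."""
        tape = dict(enumerate(tape))  # unbounded tape: missing cells read as blank "_"
        head = 0
        while state != "halt":
            symbol = tape.get(head, "_")                 # read the symbol under the head
            write, move, state = rules[(state, symbol)]  # look up the instruction
            tape[head] = write                           # write a (possibly new) symbol
            head += 1 if move == "R" else -1             # move the head right or left
        return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

    # Hypothetical rule table: flip every bit, halt at the first blank cell.
    rules = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }

    print(run_machine("10110", rules))  # prints "01001"

Real stored-program machines take this one step further: the instruction table itself lives in the same memory as the data, which is the insight underpinning every modern computer.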

Over the years there have been significant advancements in the field, with AI now a common feature in social media platforms, e-commerce recommendation algorithms, fraud prevention systems, voice assistants, autonomous vehicles, email spam filters, robotics, healthcare and more.

The mainstream proliferation of generative AI systems such as ChatGPT, DALL-E, Midjourney and Bard has brought AI to the forefront of computing, and AI is expected to be built into more and more systems going forward.

The good

Artificial intelligence is therefore a welcome companion, as it helps us do our work better:

  • Tools such as ChatGPT have been shown to increase productivity when used well with precise, domain-specific prompts.
  • Creatives can also utilise systems such as DALL-E to generate images based on text prompts.
  • In security, AI has been known to help combat fraud and spam and to minimise Business Email Compromise (BEC).
  • In healthcare, AI holds a lot of potential, with the WHO recently issuing its first global report on AI in health and six guiding principles to be followed when implementing artificial intelligence in healthcare.
  • In law enforcement, AI has been, and can be, used to proactively prevent crime through behavioural pattern recognition on high-risk individuals, though this may be frowned upon from a privacy perspective.
  • In manufacturing and assembly, AI controls robotics and relieves humans of monotonous, repetitive tasks and heavy lifting.
  • In the military, AI is used for threat detection and advance warning systems.

The use cases and potential benefits are limited only by what the mind can conceive.

The bad

While AI has its benefits, every coin has two sides and we can't have pros without cons. AI has also been used in creative ways, much to the disapproval of some quarters:

  • Deepfakes have been used to push fake news and can be a dangerous weapon in this age of social media propaganda.
  • AI has the capacity to replace some jobs. Without proper consideration of the future of the displaced workforce, we risk an unemployment crisis as businesses pick cost over livelihoods.
  • ChatGPT can be tricked with specially crafted prompts into generating malicious code, such as ransomware that has been tested against Windows 10 computers. Worse still, ChatGPT can mutate the code on command to avoid detection.

The ugly

Until recently, the dangers of AI were just reasonable concerns raised by industry insiders. However, there have been new revelations about WormGPT.

We can describe WormGPT as the evil twin of ChatGPT. Discovered by security researchers in July 2023, WormGPT is the ChatGPT of hackers and scammers. The AI chatbot can be used to:

  • Generate convincing Business Email Compromise attacks while avoiding detection.
  • Craft convincing phishing emails.
  • Create scam messages without attackers needing to jailbreak mainstream AI chatbots.
  • Generate malware that can be morphed to avoid detection.

In conclusion

AI is a welcome development and a step forward in the evolution of computing. Of course, more can be done to ensure that safety is prioritised as further development is pursued. In the end, AI really is a beneficial technology that will eventually transform how we approach computing in general.