Publication of a scientific article by the lecturer (م.م علي حسين شامان) in the Department of Computer Technology Engineering, titled "Artificial intelligence and its impact on everyday life". News date: 31/05/2023

In recent years, artificial intelligence (AI) has woven itself into our daily lives; it has become so pervasive that many of us remain unaware of both its impact and our reliance upon it.
From morning to night, as we go about our everyday routines, AI technology drives much of what we do. When we wake, many of us reach for our mobile phone or laptop to start our day. Doing so has become automatic, and integral to how we function in terms of our decision-making, planning and information-seeking.
Once we’ve switched on our devices, we instantly plug into AI functionality such as:
• face ID and image recognition
• emails
• apps
• social media
• Google search
• digital voice assistants like Apple’s Siri and Amazon’s Alexa
• online banking
• driving aids – route mapping, traffic updates, weather conditions
• shopping
• leisure downtime – such as Netflix and Amazon for films and programmes
AI touches every aspect of our personal and professional online lives today. Global communication and interconnectivity in business remain hugely important areas. Capitalising on artificial intelligence and data science is essential, and the potential growth trajectory of these technologies is limitless.
Whilst AI is accepted as almost commonplace, what exactly is it and how did it originate?
What is artificial intelligence?
AI is the intelligence demonstrated by machines, as opposed to the natural intelligence displayed by both animals and humans.
The human brain is the most complex organ, controlling all functions of the body and interpreting information from the outside world. Its neural networks comprise approximately 86 billion neurons, woven together by an estimated 100 trillion synapses. Even now, neuroscientists have yet to unravel and understand many of its intricacies and capabilities.
The human being is constantly evolving and learning; this mirrors how AI functions at its core. Human intelligence, creativity, knowledge, experience and innovation are the drivers for expansion in current, and future, machine intelligence technologies.
When was artificial intelligence invented?
During the Second World War, Alan Turing's work at Bletchley Park on breaking coded German messages marked a seminal scientific turning point. His groundbreaking efforts helped develop some of the foundations of computer science.
By the 1950s, Turing was asking whether machines could think for themselves. This radical idea, together with the growing implications of machine learning for problem solving, led to many breakthroughs in the field. Research explored the fundamental question of whether machines could be directed and instructed to:
• think
• understand
• learn
• apply their own ‘intelligence’ to solve problems as humans do.
Computer and cognitive scientists, such as Marvin Minsky and John McCarthy, recognised this potential in the 1950s. Their research, which built on Turing’s, fuelled exponential growth in this area. Attendees at a 1956 workshop held at Dartmouth College in the USA, one of the world’s most prestigious academic research universities, laid the foundations for what we now consider the field of AI. Many of those present went on to become artificial intelligence leaders and innovators over the following decades.
As testament to his groundbreaking research, the Turing Test, in its updated form, is still applied in today’s AI research and is used to gauge the success of AI developments and projects.
How does artificial intelligence work?
AI is built upon the acquisition of vast amounts of data. This data is then processed to extract knowledge, patterns and insights, and the aim is to build upon these blocks by applying the results to new and unfamiliar scenarios.
Such technology relies on advanced machine learning algorithms and extremely high-level programming, datasets, databases and computer architecture.
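As a rough sketch of this learn-from-data-then-generalise loop, the short Python example below trains a tiny classifier on a handful of made-up observations and asks it to predict an unseen case. It assumes scikit-learn is available; the commuting scenario, the invented data and the choice of a decision tree are purely illustrative assumptions, not something described in the article.

from sklearn.tree import DecisionTreeClassifier

# Historical observations: [hours of daylight, temperature in °C]
# and whether a commuter chose to cycle (1) or drive (0) -- invented data.
observations = [
    [14, 22], [13, 18], [9, 5], [8, 2],
    [15, 25], [10, 8], [12, 15], [7, -1],
]
choices = [1, 1, 0, 0, 1, 0, 1, 0]

# "Acquiring data" and "determining patterns": fit a model to the examples.
model = DecisionTreeClassifier(random_state=0)
model.fit(observations, choices)

# "Applying the results to new and unfamiliar scenarios":
# predict the choice for a day the model has never seen.
print(model.predict([[11, 12]]))

The same fit-then-predict pattern, scaled up to far larger datasets and far more sophisticated models, underpins the face recognition, voice assistants and recommendation systems listed earlier.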