By: Syed Sajid Husain, BCA
4th Sem, 1st Shift
Today’s world is seeing a great run of advancement in technology that has blessed the lives of people, easing their tasks with barely any human involvement and getting them done efficiently in no time. It may be the virtual personal assistant on your mobile phone (Google Assistant, formerly known as Google Now) or the intelligent systems that help an aircraft fly with minimal pilot involvement while carrying millions of lives every day and night. These systems have been built so smartly, after years of research, that they can handle almost every worst-case situation and overcome the fear of failure.
The term “ARTIFICIAL INTELLIGENCE” describes the scenario in which machines imitate the intelligence of human minds and perform functions close to perfection. From its earliest stages, AI was intended to help the human race to a great extent by reducing the manpower needed for a task. It was founded as an academic discipline in 1956 and has seen various ups and downs in the decades since. AI research has mainly focused on reasoning, logic, planning, natural language processing, perception, and the ability to move and manipulate objects.
The field holds a rich history of the effort put into its discovery. The first working AI programs were written in 1951 to run on the Ferranti Mark 1 machine at the University of Manchester: a checkers-playing program written by Christopher Strachey and a chess-playing program written by Dietrich Prinz. After these programs, the road to progress in this field was clear. In June 1963, MIT received a $2.2 million grant from the newly created Advanced Research Projects Agency (later known as DARPA). The money was used to fund Project MAC, which subsumed the “AI Group” founded by Minsky and McCarthy five years earlier. DARPA continued to provide three million dollars a year until the 70s.
Since then, various technologies have made our lives simpler than ever, thanks to the intelligence humans have provided to machines. Technological progress in the electronics sector, such as higher speeds, lower costs, and smaller sizes, has opened up entirely new possibilities for automation and industrial production, without which “Industry 4.0” would not be feasible. In this respect, the ambition is to “connect every asset and create these digital environments,” whether the target is a program for a simple video game or the massive workings of a spaceship.
Examples of “ARTIFICIAL INTELLIGENCE” abound in “Industry 4.0”, where it has played a vital role in making the concept of the “Internet of Things” a huge success. This is the phase where less muscle power will be used and more brainpower will be put in. AI is reaching a stage where it excellently takes human ideas expressed in natural language and translates them into actions, making it easy for workers to navigate the functions of their software. AI makes software user-friendly by understanding the user’s intention to communicate, which leads to higher productivity and fewer errors, an excellent sight from the industry’s point of view. European companies are currently leading the charge in the digital transformation of industry. In Germany, Volkswagen, one of the finest automobile manufacturers, is a step ahead in building some of the smartest cars running around the whole world. “Artificial intelligence is revolutionizing the automobile industry,” Volkswagen CEO Herbert Diess said in a press release. And “in just a few years” every new vehicle will have features such as “AI assistance for voice, gesture, and face recognition,” said Nvidia CEO Jensen Huang.
Much of what passes for “artificial intelligence” in today’s world of technology is actually machine learning, which analyzes previous data and uses it to improve predictions for the future. Modern systems are guiding workers toward building the dream products people actually expect from their respective firms, rather than rolling out products that are merely similar to existing ones and persuading people to purchase them.
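To make the idea of “analyzing previous data to improve the future” concrete, here is a minimal sketch in plain Python. The monthly demand numbers are entirely hypothetical; the point is only that fitting a simple trend line to past data lets a system forecast the next period, which is the most basic form of what machine learning does at industrial scale.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit: returns (slope, intercept) of the best line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Past demand for six months (hypothetical numbers).
months = [1, 2, 3, 4, 5, 6]
demand = [100, 110, 121, 128, 140, 152]

slope, intercept = fit_line(months, demand)
forecast = slope * 7 + intercept  # extrapolate one step ahead: month 7
print(round(forecast, 1))  # → 160.9
```

A real system would use far richer data and models, but the principle is the same: learn the pattern in what already happened and project it forward.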
It will affect manufacturing in ways we have not imagined yet. However, we can already look at some obvious examples. Computer vision has long been used for quality assurance, detecting product defects in real time. But now that manufacturing involves more data than ever, coupled with the fact that plant managers do not want to pay staff to enter data manually, AI with computer vision can also streamline how that data gets captured.
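The defect-detection idea can be sketched in a few lines of plain Python. The tiny 4x4 “images” and the threshold below are hypothetical stand-ins for real camera frames: a captured frame is compared pixel by pixel against a known-good reference, and any pixel that deviates too much is flagged as a possible defect.

```python
# Known-good reference "image" of the product (hypothetical grayscale values).
REFERENCE = [
    [10, 10, 10, 10],
    [10, 50, 50, 10],
    [10, 50, 50, 10],
    [10, 10, 10, 10],
]

def find_defects(frame, reference, threshold=20):
    """Return (row, col) positions where the frame deviates from the reference
    by more than the allowed threshold."""
    return [
        (r, c)
        for r, row in enumerate(frame)
        for c, value in enumerate(row)
        if abs(value - reference[r][c]) > threshold
    ]

# A captured frame with one anomalous bright spot (a simulated defect).
frame = [
    [10, 10, 10, 10],
    [10, 50, 50, 10],
    [10, 50, 200, 10],   # defect at row 2, col 2
    [10, 10, 10, 10],
]

defects = find_defects(frame, REFERENCE)
print(defects)  # → [(2, 2)]
```

Production systems use trained models rather than a fixed reference image, but the workflow is the same: capture, compare, flag, and log, all without a human entering the data by hand.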
But after years of brilliant discoveries, the concept of AI has started to threaten the human employment sector. The question arises: why do you need humans to complete a specific task when a machine can do it in no time and in a more systematic manner? The answer is right in front of our eyes, for it was humans who built the very technology that now works better than them. Famously, chatbots at Facebook developed their own shorthand language to communicate with each other, which forced the developers to shut down that specific program. Doesn’t that sound dangerous? Indeed it was, for had it gone unchecked it could have become a cause of destruction in people’s lives in the future.
Sadly, we are heading toward a world where we will live blindly on robotic blessings, and all we will do in the coming future is admire the brilliance and intelligence of devices, which will reduce human involvement in every daily task to near insignificance. Then the day will not be far from reality when robots walk this world while writing off the very terminology of human existence.