Frank Floor Talk: AI’s journey to dominance

Tuesday, November 25, 2025 8:00 AM
John G. Brokopp, CDC Gaming

Artificial intelligence and its many applications, specifically for our collective purposes in the gaming industry, caught this columnist by surprise, as if science fiction had suddenly become reality.

Have you ever been drawn into a discussion with friends or co-workers about a topic you knew little about, been too embarrassed to admit it, and simply gone along for the conversational ride, hoping your ignorance would not rear its ugly head?

Your humble scribe found himself in need of an “Artificial Intelligence for Dummies” course while wandering aimlessly through conversations, picking up bits and pieces of information but unable to put them together into a meaningful understanding.

Submitted for your review are some of the things I learned about the developmental history of AI and its power to have a profound impact on our lives in the present and the future.

IBM defines AI as the “use of computers and machines to mimic the problem-solving and decision-making capabilities of the human mind.”

Britannica notes that the idea of AI dates back at least 2,700 years. According to Adrienne Mayor, a research scholar, folklorist, and science historian at Stanford University, “Our ability to imagine artificial intelligence goes back to ancient times. Long before technological advances made self-moving devices possible, ideas about creating artificial life and robots were explored in ancient myths.”

Mayor draws a disturbing conclusion, observing: “Not one of the myths has a good ending once the artificial beings are sent to earth. It is as if the myths say it is great to have these artificial things up in heaven and used by the gods. But once they interact with humans, we get chaos and destruction.”

Alan Turing (1912 – 1954) was a British mathematician credited with performing the earliest work on AI in his 1948 report “Intelligent Machinery.” He helped break the Nazis’ Enigma code during World War II and devised the “Turing test” to determine whether a computer is capable of thinking.

According to Britannica, Turing conceived of an “imitation game,” proposing that if a computer could answer questions posed by a remote human interrogator in such a way that the interrogator could not distinguish the computer’s answers from those of a human subject, then the computer could be said to be intelligent and to think.

Turing predicted that by the year 2000 a computer “would be able to play the imitation game so well that an average interrogator will not have more than a 70-percent chance of making the right identification (machine or human) after five minutes of questioning.” Put another way, he expected the machine to fool an average judge at least 30 percent of the time.

John McCarthy (1927 – 2011), an American mathematician and computer scientist, is credited with being the “Father of Artificial Intelligence.” He coined the term “artificial intelligence” in 1955, defining it as “the science and engineering of making intelligent machines.”

McCarthy, the genius behind many of AI’s building blocks, created the computer programming language LISP, which is still used in AI. He also created computer chess games and the first computer with “hand-eye” capability.

He founded the Stanford Artificial Intelligence Lab (SAIL), one of the leading research facilities in the AI field, at Stanford University.

LISP, developed by McCarthy at MIT around 1960 primarily for AI work, declined in popularity during the 1990s but has recently regained traction, particularly in the open-source community. A LISP program is a function applied to data, rather than a sequence of procedural steps.
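
LISP itself is beyond the scope of this column, but a rough Python analogue can illustrate what “a function applied to data” means in practice. The small functions below are invented purely for illustration:

    # A rough analogue of the LISP style described above: the "program" is a
    # function built by composing smaller functions, then applied to data,
    # rather than a checklist of procedural steps.

    def compose(f, g):
        """Return a new function that applies g first, then f."""
        return lambda x: f(g(x))

    increment = lambda n: n + 1   # small building-block functions
    double = lambda n: n * 2

    program = compose(double, increment)   # the program is itself a function

    print(program(20))                     # double(increment(20)) -> 42
    print(list(map(program, [1, 2, 3])))   # applied across data -> [4, 6, 8]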

The very first AI program developed to mimic how human beings solve problems came from Allen Newell, J.C. Shaw, and Herbert Simon, who collaborated in 1955 and 1956 to create Logic Theorist. Their inspiration was Principia Mathematica (1910 – 1913), written in partnership by Alfred North Whitehead and the esteemed philosopher Bertrand Russell.

In 1958, Frank Rosenblatt invented the Perceptron, which he said was “the first machine which is capable of having an original idea.” Despite critical reviews from skeptics, it went on to earn a reputation as the “foundation of all artificial intelligence.”
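
For the curious, the perceptron boils down to surprisingly simple arithmetic: multiply each input by a weight, add everything up, and “fire” if the total clears a threshold; learning means nudging the weights after each mistake. Here is a minimal sketch in Python; the AND-gate training data, learning rate, and epoch count are illustrative choices, not Rosenblatt’s original setup:

    # Minimal perceptron sketch: weighted sum, threshold, and error-driven
    # weight updates. The AND-gate data and learning rate are illustrative
    # assumptions, not Rosenblatt's original configuration.

    def predict(weights, bias, inputs):
        total = bias + sum(w * x for w, x in zip(weights, inputs))
        return 1 if total > 0 else 0

    def train(samples, epochs=20, rate=0.1):
        weights, bias = [0.0, 0.0], 0.0
        for _ in range(epochs):
            for inputs, target in samples:
                error = target - predict(weights, bias, inputs)
                # Nudge each weight in the direction that reduces the error.
                weights = [w + rate * error * x for w, x in zip(weights, inputs)]
                bias += rate * error
        return weights, bias

    # Teach the perceptron the logical AND function.
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    w, b = train(data)
    for inputs, target in data:
        print(f"{inputs} -> {predict(w, b, inputs)} (expected {target})")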

The ultimate tribute to Rosenblatt came decades later from Melanie Lefkowitz of Cornell University, in an article headlined “Professor’s Perceptron Paved the Way for AI – 60 Years Too Soon.”

Marvin Minsky (1927 – 2016) is described by Britannica’s editors as one of the most famous practitioners of the science of artificial intelligence. In 1969, he received the A.M. Turing Award, the highest honor in computer science, for his pioneering work in AI.

Minsky and John McCarthy co-founded MIT’s Artificial Intelligence Project, which evolved into what is now the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).

Minsky defined AI as “the science of making machines do things that would require intelligence if done by men.” Early AI researchers struggled with the syntax of even the most powerful computer programming languages available at the time.

In 1975, he developed the concept of “frames,” a way of specifying the general knowledge that must be programmed into a computer before it can address specific situations. Frames proved to be an extremely valuable concept among AI researchers.
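
In modern terms, a frame resembles a structured record: named slots holding general, default knowledge that specific details can override. The “room” and “kitchen” frames below are a hypothetical illustration of the idea, not code from Minsky’s paper:

    # Hypothetical illustration of Minsky-style "frames": a slot-and-default
    # structure holding general knowledge that specifics can override.

    class Frame:
        def __init__(self, name, parent=None, **slots):
            self.name = name
            self.parent = parent   # more general frame to inherit from
            self.slots = slots     # slot name -> value (default or specific)

        def get(self, slot):
            """Look up a slot here, falling back to the more general frame."""
            if slot in self.slots:
                return self.slots[slot]
            if self.parent is not None:
                return self.parent.get(slot)
            return None

    # General knowledge the system holds before any specifics are considered.
    room = Frame("room", walls=4, has_door=True, lighting="unknown")

    # A more specific frame overrides and extends the general one.
    kitchen = Frame("kitchen", parent=room, lighting="bright", has_stove=True)

    print(kitchen.get("walls"))      # 4 -- inherited general default
    print(kitchen.get("lighting"))   # "bright" -- specific value overrides
    print(kitchen.get("has_stove"))  # True -- detail unique to this frame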

After the better part of a century of pioneers paving the way for AI applications, research and development is now progressing at such an astonishing rate that there is no telling what the future holds.

John G. Brokopp is a veteran journalist with 50 years of professional experience in the horse racing and gaming industries.