
Sorry, general AI is still a long, long way off

Artificial intelligence might have passed a school science test, but when everyday tasks are still well beyond its abilities, we can't even talk about building general-purpose AI.
Written by Mary Branscombe, Contributor

For the last few weeks, we've been watching a plant grow on our windowsill. A seed blew into the window box, took root and started to shoot up.

There was nothing else growing at that end of the window box, so we left it until we could see whether it was a weed or a nice plant.

The seed had been long and black, and the stem grew tall and spindly. Once we could see a few leaves, I started searching the web for a plant that grows in the UK from long black seeds, pointy at one end and rounded at the other, with a long, hairy stem and long, pointed leaves springing alternately from the stem.

SEE: How to implement AI and machine learning (ZDNet special report) | Download the report as a PDF (TechRepublic)

If you described that to a botanist or a gardener, they would tell you immediately that it was probably a sunflower, but I didn't get any useful results from searching by the description. In fact, none of the lists of UK plants with hairy stems or alternate leaf-growth patterns that I did find included the sunflower. 

It wasn't until we could see the flower forming and looking very like a sunflower that I could search for 'sunflower hairy stem' and get a description telling me that sunflowers have long, hairy stems and leaves growing alternately from the stem. Once I knew what I wanted, the machine learning behind the search engine could tell me about it, but it couldn't take my description and tell me what I was looking at.

There are lots of reasons why the search didn't find anything; search engines cover everything, so I couldn't make it a specific botanical search, let alone a specific search limited to the south of England. Even technical terms like 'leaves borne alternately from the stem' rather than 'leaves in pairs on both sides' wouldn't help, because they're both full of common words. And search engines are designed to find answers rather than questions ('what is a sunflower', as they say on Jeopardy).

I tried a few similar searches: a curlew is "a brown bird with a long bill that curves downwards", but searching for that found me web pages where I could look up birds by hand rather than actual birds. Yes, search engines started out as being tools to find web pages, but more and more they're about finding information that's on the web rather than necessarily going on to look at the site the information comes from. 

Image: Bing search results for "brown bird long beak curves down". I'm describing a curlew, but none of the results say "curlew". (Image: Mary Branscombe)

Once the flowerhead had developed enough to look like a sunflower, I took a photo and tried a reverse image search. This found a lot of pictures of garden ornaments, tree stumps and deer in woodlands, because they look like a picture of a plant against a garden background. Bing did slightly better than Google here; you can paste an image rather than having to upload a file, so I could crop it more closely and get 'something darker against trees' rather than 'house on a green lawn'.

Image: Bing's reverse image search lets me crop down to the unknown plant, but it doesn't find anything at all similar. (Image: Mary Branscombe)

TinEye, a reverse image search service that looks for exactly the same image - handy when you want to find where a picture on Pinterest originally came from - couldn't find anything similar at all. I could have tried selecting the plant and erasing the background, but a human wouldn't need you to do that before you showed them a photo of the plant you wanted to know about.

I didn't see a picture of a sunflower bud as the result of my search because the image recognition systems used by search engines haven't been trained to recognise different plants, especially not against a background of other plants. You can train image recognition on a specific domain, like dog breeds or faults in the components you manufacture, and it will do well at recognising them; but the same system may not do as well identifying types of fruit or makes of car. 

It's hard to combine deep, domain-specific knowledge with deep knowledge about another domain, or even with general recognition.
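
To see how narrow that domain training is, here's a minimal sketch of the usual approach: take a general pretrained network and fine-tune it on one labelled set of photos. It uses the open-source PyTorch and torchvision libraries, and the 'plants/train' folder of species photos is a placeholder I've invented for illustration; this isn't how any particular search engine builds its image recognition.

```python
# A minimal fine-tuning sketch: adapt a general pretrained classifier
# to one narrow domain (here, a hypothetical folder of plant photos,
# one subfolder per species). Illustration only.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("plants/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a network trained on general images, then replace its
# final layer with one sized for our plant species only.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # a single pass, for brevity
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```

The result is exactly the trade-off described above: a model that can tell those plant species apart, and knows nothing about dog breeds, fruit or cars.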

There's plenty of research into extracting more information from documents; machine-reading comprehension creates questions and answers from passages of text, so a system that's read botanical texts would have an answer to 'what plants in the UK have hairy stems and long leaves borne alternately from the stem'.
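
As a rough illustration of what machine-reading comprehension looks like in code, here's a minimal extractive question-answering sketch using the open-source Hugging Face transformers library. The botanical passage is my own invented example, not output from any system mentioned here.

```python
# A minimal machine-reading comprehension sketch: extractive QA pulls
# an answer span out of a passage it is given. Illustration only.
from transformers import pipeline

qa = pipeline("question-answering")  # loads a default pretrained QA model

passage = (
    "The sunflower is widely grown in the UK. It has a long, hairy stem, "
    "and its long, pointed leaves are borne alternately from the stem."
)

result = qa(
    question="Which UK plant has a hairy stem and alternate leaves?",
    context=passage,
)
print(result["answer"])  # expected: a span such as "The sunflower"
print(result["score"])   # the model's confidence in that span
```

The catch is the one this column keeps running into: the system can only answer from text it has been pointed at, so it's only as broad as its corpus.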

When the Aristo AI system recently passed science tests for eighth graders (13-14 years old) and high school seniors (17-18 years old), the exams didn't include any of the usual questions based on interpreting pictures and diagrams. They also skipped essay questions that needed the system to write an answer in its own words, and Aristo couldn't tackle areas of science outside the topics it was trained on. The techniques used in the Aristo system are certainly a step forward in natural language understanding, information extraction, common-sense knowledge and applying concepts in ways that look like reasoning, and the team is working on answering questions using the kinds of images used in science tests - diagrams, maps, charts and the like. But even the project's long-term goal of "a machine that has a deep understanding" would still be a specific form of AI rather than general intelligence.

There's also plenty of research into 'common knowledge'; people know that stems and leaves and buds are all parts of plants, but an AI system would need a knowledge graph that covers parts of plants. Concept graphs try to capture these kinds of relationships to understand semantic representations: a laptop is a kind of computer, Microsoft and Google are major technology companies, Beijing and Seoul are both large cities in Asia, so they belong in a list with Tokyo rather than with London and Paris.
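
To make that concrete, here's a toy concept graph in Python, built with the open-source networkx library. The handful of 'part-of' and 'is-a' edges are my own invented examples, orders of magnitude smaller than any real knowledge graph.

```python
# A toy concept graph: nodes are concepts, edges carry a relation
# label such as "part_of" or "is_a". Illustration only.
import networkx as nx

g = nx.DiGraph()
g.add_edge("stem", "plant", relation="part_of")
g.add_edge("leaf", "plant", relation="part_of")
g.add_edge("bud", "plant", relation="part_of")
g.add_edge("sunflower", "plant", relation="is_a")
g.add_edge("laptop", "computer", relation="is_a")

# Answer "what are the parts of a plant?" by following part_of edges.
parts = [child for child, _, data in g.in_edges("plant", data=True)
         if data["relation"] == "part_of"]
print(parts)  # ['stem', 'leaf', 'bud']
```

Answering my sunflower question would need graphs like this at scale, combined with the vision and language systems above; that combination is precisely the hard part.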

READ MORE: Harnessing evolution with AI

Again, those tools are going to do well on specific domains, like particular areas of science or your own company's departments, products and policies. We'll start seeing more services that spider over your corpus of documents and your org chart and do useful things with them, as well as tools for scientists that try to find relevant prior work. One machine-learning system has already been able to extract the structure of the periodic table and spot functional applications for various materials just by mining research papers. That could help put novel materials into use earlier, or make it easier to adopt techniques from nature in engineering projects.

So what do all these examples mean? For me, they suggest that while progress is being made with AI, there is still a long, long way to go. Don't be fooled by the useful tools we do have, or by all the human-powered services pretending to be AI, whether that's humans doing the first pass of training machine-learning systems or a service sold as software when it's really underpaid gig workers.

When doing something that a human expert would have no problem with is still so far out of reach - not just for our random sunflower, but for hundreds of thousands of other everyday questions - we're not anywhere close to even having the tools to talk about building general-purpose AI.
