
U of T and AI: Faculty members on their research and its impact

Photo of Goldie Nejat
“Human movement and behaviour are unpredictable, so we are seeing if robots can adapt and react to that,” says Goldie Nejat, an associate professor in the Faculty of Applied Science & Engineering

More time to interact with patients, improved efficiency in health-care delivery, and new global communities of exchange and communication: according to U of T researchers, these are just some of the positive outcomes of artificial intelligence.

“AI has the power to change the way we live and work, making more human capital available to focus on creativity and innovation,” says Cristina Amon, dean of the Faculty of Applied Science & Engineering.

Artificial intelligence and automation, and the impact of advanced technology on business strategy and on workforce supply and demand, are among the themes to be discussed at this year’s Ontario Economic Summit. Together with leaders in the private, public and not-for-profit sectors, U of T researchers and entrepreneurs will showcase how they are seizing these opportunities.

Work at U of T is already demonstrating how advances in machine learning can accelerate gains in health, education, communication and quality of life, as well as the importance of educating students to think about the ethics of the technology.

Fahad Razak, assistant professor at the Institute of Health Policy, Management and Evaluation
Team member, General Medicine Inpatient Initiative (GEMINI)

What makes some patients more likely to have a poor prognosis after being discharged from hospital?

The answer lies somewhere in the three billion data points now included in the General Medicine Inpatient Initiative (GEMINI) health database, says Razak, who is also an internist at St. Michael’s Hospital.

GEMINI is the work of a team of physicians, public health advocates and data scientists who are standardizing much of the patient information collected at seven Ontario hospitals. By tracking and comparing data on admission, discharge and tests, among other metrics, GEMINI will enable more accurate predictions about outcomes for individual patients and the ability to intervene to improve them.

Three billion individual pieces of data “is the kind of volume we need to develop predictions,” Razak says.
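
To make the idea concrete, here is a minimal sketch of the kind of outcome model such a standardized dataset could support. GEMINI’s actual pipeline is not described in this article; the file name, column names and outcome below are invented placeholders standing in for harmonized admission, discharge and test data.

```python
# Illustrative sketch only: GEMINI's real models are not public, and every
# field below is a hypothetical stand-in for standardized hospital data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Harmonized records pooled from several hospitals (hypothetical file and columns).
df = pd.read_csv("standardized_admissions.csv")
features = ["age", "length_of_stay_days", "num_lab_tests", "comorbidity_index"]
X, y = df[features], df["readmitted_within_30_days"]

# Hold out a test split so the prediction quality is measured on unseen patients.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = model.predict_proba(X_test)[:, 1]
print(f"Held-out AUC: {roc_auc_score(y_test, pred):.3f}")
```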

But the process of standardizing the data from different hospitals has also revealed gaps in the information now collected.

Because of privacy and confidentiality concerns, hospitals don’t collect income, education or second-language status data. Those characteristics, however, are important variables affecting health, Razak says.

“Many problems have to be overcome before the data can become a public policy tool.”

Read more: U of T has 'key role to play' in developing and implementing new AI technologies

 

Goldie Nejat, Canada Research Chair in Robots for Society; associate professor, Faculty of Applied Science & Engineering

After reading about or seeing the robots produced in Nejat’s lab, people have written her personal letters.

“I’ve been asked, ‘How can I get one of these robots? A family member has dementia and I can’t leave them alone,’” says Nejat.

The socially assistive robots are designed to help seniors improve their quality of life. They can score bingo cards or play memory games in nursing homes, or help prepare meals at home.

Nejat’s research team is now working on robots that can switch between all of those tasks seamlessly, the way a human can, she says.

“Human movement and behaviour are unpredictable, so we are seeing if robots can adapt and react to that,” Nejat says.

The hope is that by taking over routine tasks, the robots will free personal-care workers to spend more time interacting with the people they care for in meaningful ways.

Ashton Anderson, assistant professor in the department of computer and mathematical sciences at U of T Scarborough and the tri-campus graduate department of computer science

The availability of data about how millions of people are using online technology to communicate with each other is an unprecedented development for the social sciences, says Anderson.

Anderson teaches computational social science to undergraduates and graduate students, covering topics like crowdsourcing, the interactions of human and algorithmic decision-making, and bias and ethics in computational systems.

“There is no other subject that is more relevant to the giant companies,” Anderson says.

“Facebook and Google are basically graph algorithms that the students are learning. … They come out from my class with concrete skills they can apply to social data.”
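
As a toy illustration of what “graph algorithms” means here, the sketch below scores the nodes of a small, invented friendship graph with a PageRank-style power iteration, the kind of ranking idea behind web search and social feeds. It is not code from Anderson’s course.

```python
# Hypothetical social graph: who links to (or follows) whom.
adjacency = {
    "alice": ["bob", "carol"],
    "bob": ["carol"],
    "carol": ["alice"],
    "dave": ["carol", "alice"],
}

def pagerank(graph, damping=0.85, iterations=50):
    """Simple power-iteration PageRank over an adjacency dict."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Every node keeps a small baseline score, then receives shares
        # of rank from the nodes that point to it.
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for node, neighbours in graph.items():
            share = rank[node] / len(neighbours) if neighbours else 0.0
            for neighbour in neighbours:
                new_rank[neighbour] += damping * share
        rank = new_rank
    return rank

for node, score in sorted(pagerank(adjacency).items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.3f}")
```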

Ethics in computing is another area that is receiving increasing attention, he says. Graduate students have always grappled with those issues, but now undergraduates are exposed to the topic as well.

As an example, Anderson points to questions about how Facebook data is collected and stored by app developers. “I have a lecture on ethical issues in social network mapping, which is new, and it’s there because of the issues happening outside [the classroom].”

Avi Goldfarb, Rotman Chair in Artificial Intelligence and Healthcare
Co-author, Prediction Machines: The Simple Economics of Artificial Intelligence

AI has enormous potential to improve the quality of human life, Goldfarb says. He predicts jobs will be focused on tasks that require emotional understanding, dexterity and judgment, while robots will take on the routines of life.

Continued education can help support everyone’s ability to participate in a world shaped by the fourth industrial revolution.

“It is important to have a social safety net to help those who fall behind and to make changes to adult education and lifelong learning so that people who need to acquire new skills are motivated to do so,” he says.

At the same time, public policies will have to encourage competition and innovation, and protect individual privacy and choice.

“Policy will have to get the balance right between encouraging the development of AI through intellectual property, liability and trade policy, protecting privacy, and preventing excessive concentration of commercial AI companies.”

Elizabeth Dhuey, associate professor of economics, 91³Ô¹Ï Scarborough

How does a student take an evening course in data analytics if they do not have child care at night? And how do they choose between investing in their career and paying for daycare?

Those are the kinds of questions that will increasingly confront individuals and institutions as lifelong learning becomes a necessity in the labour market, says Dhuey, a labour economist who is one of three faculty members leading Future Skills, a project from the Munk School of Global Affairs & Public Policy.

“It will be difficult to be involved in lifelong learning when you have child-care needs, or you have to figure out how to finance courses,” she says.

Working with organizations in the private and public sector, including financial institutions, school boards and universities, the Future Skills team members are analyzing large data sets, such as those collected on the Lynda.com learning platform, to make recommendations on what kind of learner supports are required.

“There is a lot of focus on identifying and naming the skills we need, but we are not talking about some of the big questions, about child care, and about financing.”

Brian Cantwell Smith, Reid Hoffman Chair in Artificial Intelligence and the Human, Faculty of Information

Advances in artificial intelligence are leading to more people asking the question that has kept Cantwell Smith busy for several decades: Are we really that different from a machine?

If the human mind shares the same basic architecture as a computer’s artificial neural network, what makes us unique?

His provisional conclusion is that even if human and artificial circuits operate according to the same rules, some characteristics are ours alone. Wisdom is one example, he says.

A computer, for example, may be able to offer a more accurate cancer diagnosis than a doctor.

“If someone says, ‘Look, I have two kids, and if I do two years of chemo, I will get three years of life. Are people generally happy with that choice? How do I decide?’ The AI is not going to have that kind of judgment,” he says.

Such questions cannot be left to experts to decide, Cantwell Smith says. A $2.45-million gift from Reid Hoffman, the co-founder of LinkedIn, will help design public engagement strategies on these issues.

“To be an educated person in the 2020s you will have to understand these things at various levels of depth. I am trying to put into circulation ideas that give people a vocabulary to talk about the whole phenomenon.”
