
Artificial Intelligence: Our New Superpower?

A close-up of a man’s chest as he pulls a white button-down shirt open to either side, revealing the letters AI underneath, similar to Superman.

Some people find generative artificial intelligence, like ChatGPT, threatening, but it holds great promise. [Illustration: John Pelerossi, USF Advancement]

By TOM WOOLF

DID YOU ASK ALEXA TO SET A TIMER while you were cooking dinner last night?

Start the day with the weather forecast courtesy of Siri?

Artificial intelligence – AI for short – has become so intertwined with our daily lives that most of us probably don’t give it a second thought. Need help planning a trip? What about getting that last-minute birthday present delivered tomorrow? Stymied by a problem at work, or with homework?

How we think about AI has changed dramatically over the past year with the advent of generative AI tools such as ChatGPT, Google Bard, Bing Chat, DALL-E 2 and more.

What is generative artificial intelligence? Just ask Bing Chat.

“Generative artificial intelligence is a type of AI that can generate new forms of creative content, such as audio, code, images, text, simulations and videos. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics.”

That was Bing Chat’s response to this prompt: Define generative artificial intelligence in 50 words.

As the technology evolves at breakneck speed, the question being asked across all sectors of society is whether AI poses a threat or holds promise.

Prasant Mohapatra, USF’s provost and executive vice president of academic affairs, says the answer is both.

“We have to do some trade-offs between threat and promise,” says Mohapatra, an accomplished researcher in wireless networks, mobile communications, cybersecurity and internet protocols. “It has the potential to be very positive, but if generative AI is used in inappropriate ways, it may have unintended – or intended – severe consequences.”

Mohapatra co-chairs the Generative AI Strategic Planning Group, created by USF President Rhea Law and composed of faculty, staff and administrators to develop guidelines for using the technology. Members are exploring AI’s role in teaching and learning, as well as operationally in such areas as human resources, admissions, and business and finance.

“AI is not magical, it is based on fundamental aspects of science, and we want our students to learn about the foundations of AI and how it can be leveraged in a positive way,” Mohapatra says. “At the same time, we have to make sure the future generation of leaders – our students – learn about the negative aspects, that if it is not used in a proper way, it can do more harm than good.”

On the operations side, Mohapatra says, “Things are moving so quickly, we have to make sure that the AI tools we are using meet our requirements for performance and accuracy.”

There also are ethical considerations, such as in the purchase of products and services.

“We should not have vendor selection software driven by AI that is biased toward only large corporations,” he says.

Sidney Fernandes, MS ’00, vice president of information technology and chief information officer at USF, co-chairs the strategic planning group. Generative AI has taken the world by storm and it’s very promising, he says. But he also has concerns.

“There has to be a lot of effort by those using AI to ensure that it is used ethically, that it is used with a fair degree of skepticism of the answers it provides and that it is used as an assistant to the human being, not as a replacement,” he says.

As an example, Fernandes referred to recent cases where other universities used plagiarism detection software to determine whether essays submitted by students were original work or AI-generated.

“In some cases, students who were caught cheating were wrongly accused because AI has some implicit biases, especially toward non-native speakers,” he says.

Jenifer Jasinski Schneider, ’89 and MA ’92, a professor of literacy studies and president of the Faculty Senate, also serves on the planning group. She says faculty members have expressed mixed reactions to generative AI.

“Some are very interested in it and see the potential and are excited about learning about it,” she says. “Others are very suspicious and concerned.”

While she can “see all sides to it,” Schneider views AI as a tool with great promise.

“I’ve been using it in a variety of ways, such as in developing a course and writing emails,” she says. “I’ve asked it questions to see what it knows, whether it’s accurate. A colleague and I have been playing around with how we query it, because how you prompt it changes the responses you receive.”

She feels a sense of urgency when it comes to AI in the classroom.

“We have to make sure our students are prepared for the workforce,” she says, “that they understand what business, or medicine, or education, or social science is doing with AI.”

Worries about students using AI to plagiarize “should just be taken off the table,” she adds. “AI is here to stay. As educators, perhaps we need to think about how we use writing as a catchall assignment to demonstrate knowledge. Maybe we could use alternative methods.”

Kobe Phillips, a senior majoring in ecology and evolutionary biology and a member of USF’s Judy Genshaft Honors College, uses an AI design tool in one of his classes – something that is encouraged by his professor. Phillips believes AI is a “phenomenal resource” for students and faculty.

“There is so much potential in this space, be it for helping students to create study guides or create code or new solutions when they couldn’t think of one, or even for professors for generating questions,” Phillips says. “As students, we want to put our best foot forward. We are here to learn and these new resources could transform our ability to learn.”

IS IT ART WITHOUT HEART?

Two framed images, one a still life of fruit overlaid with digital lines, the other a pixelated image of the Mona Lisa, float on a background of circuitry.

Generative artificial intelligence can render beautiful images, but it can’t generate ideas and direction – that’s what artists bring. [Illustration: John Pelerossi, USF Advancement]

Heather Sellers is an accomplished poet. An award-winning author of books, short stories and essays.

And a disrupter.

“Writers and artists are welcoming of disruption,” says Sellers, director of USF’s creative writing program in the College of Arts and Sciences. “Artificial intelligence is a great disruption. I think we see our own role in society as disrupters, to ask questions and push out of the way how things have been done in order to move things forward in a new way.”

Sellers, who has taught at the college level for 30 years, including the past 10 at USF, says that at this point, AI cannot write a beautiful poem or create a novel with depth and meaning. But that day may be coming.

If it does, “It’s going to completely change our understanding of the human experience,” she says. “What I think is so important about the humanities being at an inflection point like this is our engagement with the ancient questions that have always governed our discipline: What is it to be human? What is it to feel? What is it that is important and to be cherished in the human experience that needs to be fed into AI and fed back to us so there’s a synergy in the relationship? It’s going to continue to evolve, but I don’t find that threatening.”

As she has experimented with generative AI tools, she has found they can’t do what poets do.

“AI is not able to render the complexity and depth of the human experience and put those into language that’s beautiful and meaningful,” Sellers says. “What’s exciting in the classroom is to be able to show students, when you ask AI to write a poem in the style of Robert Frost, why it isn’t able to do that. There are a lot of aspects of meaning-making and language and the human experience that are beyond its capabilities.”

It can be helpful with formulaic writing like letters of recommendation and program reviews, she notes. But while poetry and great novels use form, they are not formulaic.

AI stretches the imagination and can be a great collaborator, says McArthur Freeman II, an associate professor of animation and digital modeling in the School of Art and Art History.

He often employs technology for films and games in his creative efforts. He’s also a painter and a sculptor whose sculptures begin as digital models.

“With AI, the question has to do with what we bring to the table,” he says. “AI has no feelings, it’s not invested, it has no desire. One type of AI can render images in very exquisite and beautiful ways with lighting and texture, but what it doesn’t do is generate ideas and a direction. That’s what artists bring, their perspective.”

AI has been shown to reflect bias, and that concerns Freeman. He recalls the experience he and his wife had when they asked a generative AI tool to produce images of 100 physicians.

“They were all white males and almost all of them had gray hair,” Freeman says. “It’s easy to look at it and think, ‘It’s a computer, it doesn’t have bias.’ All of these programs are trained off of the data that someone inputs.”

Sellers agrees that’s a problem, but it also creates an opportunity for her as a teacher.

“From an educational standpoint, it’s very exciting to feed AI a prompt that you know is going to generate something that’s blatantly biased,” she says.

She asked a generative AI program to translate “non-binary” into Spanish.

“It will say ‘no binario,’” the masculine form, she says. “If you ask it what that means, it says that depends on the gender of the person, whether it’s a male or a female. It just doesn’t know.

“But that’s one of the things students come to a university for, to learn critical thinking, to be able to slow down and assess. I don’t think it’s any different than the way we’ve been asking students to always consider the source since they were in fourth grade. It’s a source and it has incredible weaknesses. Bias is baked into everything, and this is a great opportunity to bring awareness to bias and authorship and ownership.”

Noting fears of widespread use of AI by students to cheat, Freeman says, “We have a responsibility to prepare students to adapt and compete in a world that will be utilizing AI after they graduate. Rather than focusing solely on restricting its use, we need to find new strategies to employ AI as part of the learning process.”

New technologies can be useful tools and facilitate new ways of thinking, he says.

“They prompt us to ask questions like, ‘How can AI enrich our understanding of the creative process and our role in it? What can it enable us to learn that was less accessible before? What do we need to strengthen in our own education to better leverage and collaborate with these tools?’”

AI, Freeman says, “can reveal things that we haven’t been able to readily perceive, which allows us the opportunity to learn to see new things.”

IN SICKNESS AND IN HEALTH, A HELPFUL TOOL

A high-fidelity mannequin lies in a hospital bed while two nursing students practice listening to its “heartbeat” and checking other vital signs.

College of Nursing Dean Usha Menon says AI could improve training simulators, such as this child mannequin being used by USF student nurses.

Usha Menon recalls a moment early in her career while working with a more experienced nurse in a hospital maternity ward. Her mentor looked down at a tiny patient and told Menon that something was about to happen with this baby.

“I said, ‘How do you know? That baby looks perfectly fine,’” Menon says. “And she said, ‘Well, the hair on the back of my neck is standing up.’”

Years later as a cardiac care nurse, Menon would sometimes find herself hovering around certain patients’ rooms, braced for a crisis. “I couldn’t quite say why. It was just a feeling.”

That is what generative artificial intelligence cannot mimic or replace.

“When we think about machine capabilities, how do you program for that extensive experience and the gut feeling that humans bring to a situation?” asks Menon, dean of the USF College of Nursing and senior associate vice president of USF Health. “I don’t think you can.”

While Menon and other medical professionals say AI has its limits, they agree it’s already providing benefits. As demand for health care overwhelms the supply of providers, they welcome AI taking over administrative chores and other tasks that don’t require their expertise.

“We have an aging and sicker population, particularly here in the Tampa Bay region. We have 1,000-plus people a day moving into the state, and our ability to deliver care is increasingly stretched thin,” says Dr. Nishit Patel, MD ’10, a professor in the USF Health Morsani College of Medicine, vice president of medical informatics for USF Tampa General Physicians, and vice president and chief medical informatics officer at Tampa General Hospital. “We have to figure out, how do you serve those additional needs with the same or less of a workforce?”

The COVID pandemic accelerated the development and use of tools like USF Health’s patient portal, MyChart, which streamlines communications, freeing providers for more hands-on care. Patients use MyChart to schedule appointments, message with providers and request prescription refills, all from home.

“We went from about 330,000 patient portal messages pre-pandemic, in 2019, to almost 900,000 last year,” Patel says. “You look at tools like generative AI because the promise of what’s there is to solve a problem that seemed impossible just a couple of years ago.”

Imagine AI sorting through patients’ medical records.

“We have tons of valuable information captured in your electronic health record, but many of those valuable insights are hidden away from physicians because of the sheer volume,” Patel says. “Generative AI has the potential to scrape the entirety of a patient’s chart and provide me with a high-yield summary of all of the key events and results since I last saw the patient.”

It doesn’t replace providers’ decision-making, Patel says; rather, it allows them to make better decisions more quickly.

For example, at Tampa General Hospital, surgical patients get speedier access to specialized nursing care thanks to AI.

“When a patient is coming in for surgery, we have to think about what happens after the procedure, including how long they will need to be in a post-anesthesia care unit (PACU), if they will need to stay in the hospital afterwards and which type of specialized unit they need to be placed into for that type of surgery,” Patel says. “Bed planning and capacity planning are incredibly complex activities that have historically required a lot of time and generally occurred the morning of the procedure.”

TGH has developed predictive models that have shortened PACU hold times by 28%, reduced bed planning time by 83% and shifted bed planning to days before the procedure. The models have a 95% accuracy rate.

“What this means for patients is that we have made all the necessary planning for their successful recovery before they even step foot into the operating room,” Patel says.

But will the day come when algorithms make medical decisions?

“The fundamental practice of medicine remains the same,” Patel says. “Medical decision-making still occurs at the cross section of data, experience and training, and that does not change with AI.”

In psychiatry, Dr. Ryan Wagoner, MBA ’22, has not seen widespread adoption of AI beyond patient scheduling. But he sees its potential. An associate professor and division chief of the Morsani College of Medicine Department of Psychiatry and Behavioral Neurosciences, Wagoner says AI may one day offer limited help in treating mental illness.

Primary care doctors, who treat most “straightforward” issues such as depression and anxiety disorders, might find AI’s algorithms useful in prescribing medications, Wagoner says. But for more complicated problems, patients will still need people.

“Especially in psychiatry, very often people do not want to tell a computer ‘Here’s why I’m feeling so awful about something’ or ‘Here’s this unusual experience that I’m having,’” Wagoner says. “They want a human being to be able to relate to and provide some empathy to understand that shared human experience.”

If someone in a fragile emotional state seeks help from a chatbot and it responds inappropriately, what will be the impact? he asks. Mental-health professionals might also say the wrong thing, but they can read patients’ cues and switch gears.

“There are some stops in there that humans have whenever they see another individual’s emotional state headed in a certain way,” Wagoner says. “A chatbot won’t have that.”

AI can be an amazing tool with great potential to assist in health care, he says.

“But I’m not looking for AI to replace what I do anytime soon.”

MORE WINNERS THAN LOSERS IN THE WORKFORCE

Is generative AI coming for your job?

Maybe. Maybe not.

The past year’s rapid growth of such tools as ChatGPT, Google Bard, Bing Chat and DALL-E 2 has led to widespread speculation about which workers they may one day replace – and which new career options they may create.

Distinguished University Professor Sudeep Sarkar describes generative AI as “a computing technology that is going to unleash the human potential. It is a tool that’s going to accelerate innovation and creativity.”

Sarkar chairs USF’s Department of Computer Science and Engineering and is co-director of the USF Institute for Artificial Intelligence + X. The “X,” he explains, can apply to a wide variety of disciplines – business, biology, finance, public health, for example. Sarkar also is a member of the university’s strategic planning group that’s developing guidelines for the use of AI in teaching, learning and USF operations.

The technology has remarkable potential to transform work. But, Sarkar says, “I haven’t seen companies saying, ‘We are going to get rid of this job because generative AI can do it.’

“What’s going to happen is that the nature of some jobs will morph,” he adds. “Some jobs will become larger in scope, while for other jobs, the nature of the work will shift.”

USF Innovative Education, working with faculty in the College of Engineering, has created an Artificial Intelligence Certificate program. It is fully online, catering to the needs of working adults seeking to upskill or reskill to advance their careers. Students learn how to design and deploy AI for real-world applications.

Sidney Fernandes, vice president of information technology and chief information officer at USF, says that when it comes to the impact of AI on jobs, “There is no one-size-fits-all answer.”

There may be new opportunities in AI research and development, as well as positions focused on compliance, security, and ethical and responsible AI implementation. He also notes the executive order issued by President Joe Biden in late October regarding AI regulation and the need for transparency, suggesting a need for new skill sets in legal compliance and information technology.

Existing jobs that might be dramatically affected by AI include those involving repetitive tasks or processes that can be automated. Examples include data entry, some aspects of customer service and entry-level white-collar jobs in fields ranging from technology and legal to human resources and health care.

“In all of these cases, the jobs will not go away,” Fernandes says. “Rather, AI will be something of a tool for creating more efficiencies and better outcomes, as long as the users of the tools have a firm understanding of the limitations.

“There will be the need for a human to review, make judgments and ensure that we do not ever trust the AI answers.”