Features
Artificial Intelligence: Our New Superpower?
By TOM WOOLF
DID YOU ASK ALEXA TO SET A TIMER while you were cooking dinner last night?
Start the day with the weather forecast courtesy of Siri?
Artificial intelligence, AI for short, has become so intertwined with our daily lives that most of us probably don't give it a second thought. Need help planning a trip? What about getting that last-minute birthday present delivered tomorrow? Stymied by a problem at work, or with homework?
How we think about AI has changed dramatically over the past year with the advent of generative AI tools such as ChatGPT, Google Bard, Bing Chat, DALL-E 2 and more.
What is generative artificial intelligence? Just ask Bing Chat.
"Generative artificial intelligence is a type of AI that can generate new forms of creative content, such as audio, code, images, text, simulations and videos. Generative AI models learn the patterns and structure of their input training data and then generate new data that has similar characteristics."
That was Bing Chat's response to this prompt: Define generative artificial intelligence in 50 words.
As the technology evolves at breakneck speed, the question being asked across all sectors of society is whether AI poses a threat or holds promise.
Prasant Mohapatra, USF's provost and executive vice president of academic affairs, says the answer is both.
"We have to do some trade-offs between threat and promise," says Mohapatra, an accomplished researcher in wireless networks, mobile communications, cybersecurity and internet protocols. "It has the potential to be very positive, but if generative AI is used in inappropriate ways, it may have unintended, or intended, severe consequences."
Mohapatra co-chairs the Generative AI Strategic Planning Group, created by USF President Rhea Law and composed of faculty, staff and administrators to develop guidelines for using the technology. Members are exploring AI's role in teaching and learning, as well as operationally in such areas as human resources, admissions, and business and finance.
"AI is not magical, it is based on fundamental aspects of science, and we want our students to learn about the foundations of AI and how it can be leveraged in a positive way," Mohapatra says. "At the same time, we have to make sure the future generation of leaders, our students, learn about the negative aspects, that if it is not used in a proper way, it can do more harm than good."
On the operations side, Mohapatra says, "Things are moving so quickly, we have to make sure that the AI tools we are using meet our requirements for performance and accuracy."
There are also ethical considerations, such as in the purchase of products and services.
"We should not have vendor selection software driven by AI that is biased toward only large corporations," he says.
Sidney Fernandes, MS '00, vice president of information technology and chief information officer at USF, co-chairs the strategic planning group. Generative AI has taken the world by storm and it's very promising, he says. But he also has concerns.
"There has to be a lot of effort by those using AI to ensure that it is used ethically, that it is used with a fair degree of skepticism of the answers it provides and that it is used as an assistant to the human being, not as a replacement," he says.
As an example, Fernandes referred to recent cases where other universities used plagiarism detection software to determine whether essays submitted by students were original work or AI-generated.
"In some cases, students who were caught cheating were wrongly accused because AI has some implicit biases, especially toward non-native speakers," he says.
Jenifer Jasinski Schneider, '89 and MA '92, a professor of literacy studies and president of the Faculty Senate, also serves on the planning group. She says faculty members have expressed mixed reactions to generative AI.
"Some are very interested in it and see the potential and are excited about learning about it," she says. "Others are very suspicious and concerned."
While she can "see all sides to it," Schneider views AI as a tool with great promise.
"I've been using it in a variety of ways, such as developing a course and in writing emails," she says. "I've asked it questions to see what it knows, whether it's accurate. A colleague and I have been playing around with how we query it, because how you prompt it changes the responses you receive."
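For readers curious what that kind of experimentation looks like, here is a minimal sketch, not from the article, of sending the same question to a chat model with two differently worded prompts and comparing the answers. It assumes the OpenAI Python SDK with an API key set in the environment; the model name is only an example.

```python
# Illustrative only: how changing the prompt changes the response.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name below is an example, not a recommendation.
from openai import OpenAI

client = OpenAI()

prompts = [
    "Define generative artificial intelligence.",
    "Define generative artificial intelligence in 50 words, for a general audience.",
]

for prompt in prompts:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"PROMPT: {prompt}")
    print(f"RESPONSE: {response.choices[0].message.content}\n")
```

Running the two prompts side by side makes the point Schneider describes: the wording, length limit and audience named in the prompt visibly shape the answer that comes back.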
She feels a sense of urgency when it comes to AI in the classroom.
"We have to make sure our students are prepared for the workforce," she says, "that they understand what business, or medicine, or education, or social science is doing with AI."
Worries about students using AI to plagiarize "should just be taken off the table," she adds. "AI is here to stay. As educators, perhaps we need to think about how we use writing as a catchall assignment to demonstrate knowledge. Maybe we could use alternative methods."
Kobe Phillips, a senior majoring in ecology and evolutionary biology and a member of USF's Judy Genshaft Honors College, uses an AI design tool in one of his classes, something that is encouraged by his professor. Phillips believes AI is a "phenomenal resource" for students and faculty.
"There is so much potential in this space, be it for helping students to create study guides or create code or new solutions when they couldn't think of one, or even for professors for generating questions," Phillips says. "As students, we want to put our best foot forward. We are here to learn and these new resources could transform our ability to learn."
IS IT ART WITHOUT HEART?
Heather Sellers is an accomplished poet. An award-winning author of books, short stories and essays.
And a disrupter.
"Writers and artists are welcoming of disruption," says Sellers, director of USF's creative writing program in the College of Arts and Sciences. "Artificial intelligence is a great disruption. I think we see our own role in society as disrupters, to ask questions and push out of the way how things have been done in order to move things forward in a new way."
Sellers, who has taught at the college level for 30 years, including the past 10 at USF, says that at this point, AI cannot write a beautiful poem or create a novel with depth and meaning. But that day may be coming.
If it does, "It's going to completely change our understanding of the human experience," she says. "What I think is so important about the humanities being at an inflection point like this is our engagement with the ancient questions that have always governed our discipline: What is it to be human? What is it to feel? What is it that is important and to be cherished in the human experience that needs to be fed into AI and fed back to us so there's a synergy in the relationship? It's going to continue to evolve, but I don't find that threatening."
As she has experimented with generative AI tools, she has found they can't do what poets do.
"AI is not able to render the complexity and depth of the human experience and put those into language that's beautiful and meaningful," Sellers says. "What's exciting in the classroom is to be able to show students, when you ask AI to write a poem in the style of Robert Frost, why it isn't able to do that. There are a lot of aspects of meaning-making and language and the human experience that are beyond its capabilities."
It can be helpful with formulaic writing like letters of recommendation and program reviews, she notes. But while poetry and great novels use form, they are not formulaic.
AI stretches the imagination and can be a great collaborator, says McArthur Freeman II, an associate professor of animation and digital modeling in the School of Art and Art History.
He often employs technology for films and games in his creative efforts. He's also a painter and a sculptor whose pieces begin as digital models.
"With AI, the question has to do with what we bring to the table," he says. "AI has no feelings, it's not invested, it has no desire. One type of AI can render images in very exquisite and beautiful ways with lighting and texture, but what it doesn't do is generate ideas and a direction. That's what artists bring, their perspective."
AI has been shown to reflect bias, and that concerns Freeman. He recalls the experience he and his wife had when they asked a generative AI tool to produce images of 100 physicians.
"They were all white males and almost all of them had gray hair," Freeman says. "It's easy to look at it and think, 'It's a computer, it doesn't have bias.' All of these programs are trained off of the data that someone inputs."
Sellers agrees that's a problem, but it also creates an opportunity for her as a teacher.
"From an educational standpoint, it's very exciting to feed AI a prompt that you know is going to generate something that's blatantly biased," she says.
She asked a generative AI program to translate "non-binary" into Spanish.
"It will say 'no binario,'" the masculine form, she says. "If you ask it what that means, it says that depends on the gender of the person, if it's a male or a female. It just doesn't know.
"But that's one of the things students come to a university for, to learn critical thinking, to be able to slow down and assess. I don't think it's any different than the way we've been asking students to always consider the source since they were in fourth grade. It's a source and it has incredible weaknesses. Bias is baked into everything, and this is a great opportunity to bring awareness to bias and authorship and ownership."
Noting fears of widespread use of AI by students to cheat, Freeman says, "We have a responsibility to prepare students to adapt and compete in a world that will be utilizing AI after they graduate. Rather than focusing solely on restricting its use, we need to find new strategies to employ AI as part of the learning process."
New technologies can be useful tools and facilitate new ways of thinking, he says.
"They prompt us to ask questions like, 'How can AI enrich our understanding of the creative process and our role in it? What can it enable us to learn that was less accessible before? What do we need to strengthen in our own education to better leverage and collaborate with these tools?'"
AI, Freeman says, "can reveal things that we haven't been able to readily perceive, which allows us the opportunity to learn to see new things."
IN SICKNESS AND IN HEALTH, A HELPFUL TOOL
Usha Menon recalls a moment early in her career while working with a more experienced nurse in a hospital maternity ward. Her mentor looked down at a tiny patient and told Menon that something was about to happen with this baby.
"I said, 'How do you know? That baby looks perfectly fine,'" Menon says. "And she said, 'Well, the hair on the back of my neck is standing up.'"
Years later as a cardiac care nurse, Menon would sometimes find herself hovering around certain patients' rooms, braced for a crisis. "I couldn't quite say why. It was just a feeling."
That is what generative artificial intelligence cannot mimic or replace.
"When we think about machine capabilities, how do you program for that extensive experience and the gut feeling that humans bring to a situation?" asks Menon, dean of the USF College of Nursing and senior associate vice president of USF Health. "I don't think you can."
While Menon and other medical professionals say AI has its limits, they agree it's already providing benefits. As demand for health care overwhelms the supply of providers, they welcome AI taking over administrative chores and other tasks that don't require their expertise.
"We have an aging and sicker population, particularly here in the Tampa Bay region. We have 1,000-plus people a day moving into the state, and our ability to deliver care is increasingly stretched thin," says Dr. Nishit Patel, MD '10, a professor in the USF Health Morsani College of Medicine, vice president of medical informatics for USF Tampa General Physicians, and vice president and chief medical informatics officer at Tampa General Hospital. "We have to figure out, how do you serve those additional needs with the same or less of a workforce?"
The COVID pandemic accelerated the development and use of tools like USF Health's patient portal, MyChart, which streamlines communications, freeing providers for more hands-on care. Patients use MyChart to schedule appointments, message with providers and request prescription refills, all from home.
"We went from about 330,000 patient portal messages pre-pandemic, in 2019, to almost 900,000 last year," Patel says. "You look at tools like generative AI because the promise of what's there is to solve a problem that seemed impossible just a couple of years ago."
Imagine AI sorting through patientsâ medical records.
"We have tons of valuable information captured in your electronic health record, but many of those valuable insights are hidden away from physicians because of the sheer volume," Patel says. "Generative AI has the potential to scrape the entirety of a patient's chart and provide me with a high-yield summary of all of the key events and results since I last saw the patient."
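As a rough illustration of the idea, and not a description of any system USF Health or Tampa General actually runs, a chart-summary feature could be as simple as passing a patient's recent notes to a language model with a summarization prompt. The sketch below assumes the OpenAI Python SDK; the note text, model name and function are made up for the example.

```python
# Illustrative sketch only: summarizing chart notes with a language model.
# Not the tooling described in the article; notes and model name are made up.
from openai import OpenAI

client = OpenAI()

def summarize_chart(notes: list[str]) -> str:
    """Return a brief summary of key events and results from chart notes."""
    prompt = (
        "Summarize the key clinical events, results and changes in the "
        "following chart notes as a short bulleted list for the treating "
        "physician:\n\n" + "\n\n".join(notes)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Example call with synthetic, non-identifying notes
print(summarize_chart([
    "2024-01-10: Follow-up for hypertension; BP 142/90; lisinopril increased.",
    "2024-03-02: ER visit for chest pain; troponin negative; stress test ordered.",
]))
```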
It doesn't replace providers' decision-making, Patel says; rather, it allows them to make better decisions more quickly.
For example, at Tampa General Hospital, surgical patients get speedier access to specialized nursing care thanks to AI.
"When a patient is coming in for surgery, we have to think about what happens after the procedure, including how long they will need to be in a post-anesthesia care unit (PACU), if they will need to stay in the hospital afterwards and which type of specialized unit they need to be placed into for that type of surgery," Patel says. "Bed planning and capacity planning are incredibly complex activities that have historically required a lot of time and generally occurred the morning of the procedure."
TGH has developed predictive models that have shortened PACU hold times by 28%, reduced bed planning time by 83% and shifted bed planning to days before the procedure. They have a 95% accuracy rate.
"What this means for patients is that we have made all the necessary planning for their successful recovery before they even step foot into the operating room," Patel says.
But will the day come when algorithms make medical decisions?
"The fundamental practice of medicine remains the same," Patel says. "Medical decision-making still occurs at the cross section of data, experience and training, and that does not change with AI."
In psychiatry, Dr. Ryan Wagoner, MBA '22, has not seen widespread adoption of AI beyond patient scheduling. But he sees its potential. An associate professor and division chief of the Morsani College of Medicine Department of Psychiatry and Behavioral Neurosciences, Wagoner says AI may one day offer limited help in treating mental illness.
Primary care doctors, who treat most "straightforward" issues such as depression and anxiety disorders, might find AI's algorithms useful in prescribing medications, Wagoner says. But for more complicated problems, patients will still need people.
"Especially in psychiatry, very often people do not want to tell a computer 'Here's why I'm feeling so awful about something' or 'Here's this unusual experience that I'm having,'" Wagoner says. "They want a human being to be able to relate to and provide some empathy to understand that shared human experience."
What will be the impact, he asks, if someone in a fragile emotional state seeks help from a chatbot and it responds inappropriately? Mental-health professionals might also say the wrong thing, but they can read patients' cues and switch gears.
"There are some stops in there that humans have whenever they see another individual's emotional state headed in a certain way," Wagoner says. "A chatbot won't have that."
AI can be an amazing tool with great potential to assist in health care, he says.
"But I'm not looking for AI to replace what I do anytime soon."
MORE WINNERS THAN LOSERS IN THE WORKFORCE
Is generative AI coming for your job?
Maybe. Maybe not.
The past year's rapid growth of such tools as ChatGPT, Google Bard, Bing Chat and DALL-E 2 has led to widespread speculation about which workers they may one day replace, and which new career options they may create.
Distinguished University Professor Sudeep Sarkar describes generative AI as "a computing technology that is going to unleash the human potential. It is a tool that's going to accelerate innovation and creativity."
Sarkar chairs USF's Department of Computer Science and Engineering and is co-director of the USF Institute for Artificial Intelligence + X. The "X," he explains, can apply to a wide variety of disciplines: business, biology, finance, public health, for example. Sarkar also is a member of the university's strategic planning group that's developing guidelines for the use of AI in teaching, learning and USF operations.
The technology has remarkable potential to transform work. But, Sarkar says, "I haven't seen companies saying, 'We are going to get rid of this job because generative AI can do it.'
"What's going to happen is that the nature of some jobs will morph," he adds. "Some jobs will become larger in scope, while for other jobs, the nature of the work will shift."
USF Innovative Education, working with faculty in the College of Engineering, has created an Artificial Intelligence Certificate program. It is fully online, catering to the needs of working adults seeking to upskill or reskill to advance their careers. Students learn how to design and deploy AI for real-world applications.
Sidney Fernandes, vice president of information technology and chief information officer at USF, says that when it comes to the impact of AI on jobs, "There is no one-size-fits-all answer."
There may be new opportunities in AI research and development, as well as positions focused on compliance, security, and ethical and responsible AI implementation. He also noted the executive order issued by President Joe Biden in late October regarding AI regulation and the need for transparency, suggesting the need for new skill sets in legal compliance and information technology.
Existing jobs that might be dramatically affected by AI include those involving repetitive tasks or processes that can be automated. Examples include data entry, some aspects of customer service, and entry-level white-collar jobs in fields ranging from technology and legal to human resources and health care.
"In all of these cases, the jobs will not go away," Fernandes says. "Rather, AI will be something of a tool for creating more efficiencies and better outcomes, as long as the users of the tools have a firm understanding of the limitations.
"There will be the need for a human to review, make judgments and ensure that we do not ever trust the AI answers."