The Daily Tar Heel
Printing news. Raising hell. Since 1893.
Wednesday, Feb. 28, 2024


UNC faculty and staff gain access to AI generative tool Microsoft Copilot


As of this month, UNC faculty and staff can access a generative artificial intelligence tool to summarize articles, generate coursework and accelerate their online research.

Information Technology Services made a version of Microsoft Copilot, formerly known as Bing Chat Enterprise, available to UNC employees on Nov. 8. On Nov. 15, Microsoft announced that Bing Chat and Bing Chat Enterprise had become Copilot.

While ITS still advises University employees to use caution when sharing information with any chatbot or AI tool, the University’s version of Microsoft Copilot is advertisement-free and does not store or view users’ chats.

“The absolute most important, main thing is that we’re providing access to the faculty and staff in a way that gives them a partition — that is, the institutional partition — that has some protections beyond just using a commercially-available free tool,” said Michael Barker, vice chancellor of ITS.

Unlike OpenAI's popular generative chatbot ChatGPT, Microsoft Copilot can connect to apps used by the employee (such as Word, Excel and PowerPoint) and generate images based on text prompts.

Stan Ahalt, dean of the UNC School of Data Science and Society, said Microsoft Copilot is also more “grounded” than ChatGPT because it uses available, reliable data to constrain its responses and avoid factual errors.

Answers from Microsoft Copilot come with cited sources from the internet, which Barker said accompany a brief summary of the information it can find online to answer a user’s question. This method of presenting information makes internet research quicker and makes results more concise, though some users have found that factual errors appear in the summary component of some of these responses. 

“The chat can produce hallucinations, can provide inaccuracies and can expose biases that are the content of what’s on the web and what it’s been trained on,” Barker said. “These are all improving over time, but those are some of the weaknesses, at least at present.”

A chatbot hallucination is a phenomenon where it "perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate," according to IBM. The chatbot is, in essence, making something up. 

Research by scholars at Stanford University and the Massachusetts Institute of Technology found that artificial intelligence is increasing worker productivity in the workplace by an average of 14 percent. Some UNC faculty members, like business professor Mark McNeilly and journalism professor Steven King, are already using generative AI in their coursework.

Ahalt said that many UNC graduates will be “called on” to use AI in the workplace. 

“I expect that lots of people will end up wanting to use it for their day-to-day activities, particularly for some of the more mundane things,” Ahalt said. “I think that a number of people have been experimenting with it in the classroom and trying to understand its impact, and it certainly will impact almost every disciplinary area.”

Senior Lily Friedman, who follows developments in machine learning, said she is worried about the future of online research in undergraduate programs. 

“There are a lot of examples, already, of large language models getting things very incorrect and people having no idea,” she said.

Friedman said that large language models like Microsoft Copilot are trained on large “dumps” from the internet, often without the consent of the people whose work is used, which can sometimes result in plagiarism, especially in the field of AI-generated artwork. A large language model (LLM) is a type of AI that can recognize and generate text, among other tasks.

On the other hand, she said the technology can be used to streamline certain tasks.

"LLMs can make the work of UNC employees more efficient, but Microsoft Copilot is still capable of producing false information and inaccurate results," Ahalt said.

He added that users should never separate their search results from the context in which they plan to use them.

“We live in a world where information is abundant, and not all of it is useful, and some of it is downright awful,” Ahalt said. “Thinking about that critically, all the time, is an important part of being a good citizen.”

University Libraries established the Carolina AI Literacy initiative in June to prepare students and employees to use the technology responsibly. Three free modules about prompting, fact-checking and documenting sources while using AI are available on the CAIL website.

Barker said employees who use the secure, institutionally-scaled version of Microsoft Copilot will learn its strengths and weaknesses, discovering through hands-on experience how to achieve good results using the technology. 


“Engaging with it is the necessary first step to getting the answers to those additional questions about how best to use it,” he said. “You’ve got to engage with it in order to understand it.”