Northeastern U. Student Demands Tuition Refund After Catching Professor Using ChatGPT
Northeastern ultimately decided to reject Ella Stapleton’s claim.

Artificial intelligence is a tool.
However, it can also be a crutch. One student is not happy that she is paying for an education from an instructor who relies too heavily on AI, especially when that instructor prohibits students from using it.
A senior student at Northeastern University, Ella Stapleton, demanded a tuition refund of over $8,000 after she determined that her business professor, Rick Arrowood, had covertly used AI tools (e.g., OpenAI’s ChatGPT, Perplexity AI, and the presentation generator Gamma) to generate lecture notes and course materials.
This was particularly contentious because the course syllabus explicitly prohibited students from using AI tools for their assignments, a rule the professor himself was not following.
In February, Ella Stapleton, then a senior at Northeastern University, was reviewing lecture notes from her organizational behavior class when she noticed something odd. Was that a query to ChatGPT from her professor?
Halfway through the document, which her business professor had made for a lesson on models of leadership, was an instruction to ChatGPT to “expand on all areas. Be more detailed and specific.” It was followed by a list of positive and negative leadership traits, each with a prosaic definition and a bullet-pointed example.
…Ms. Stapleton decided to do some digging. She reviewed her professor’s slide presentations and discovered other telltale signs of A.I.: distorted text, photos of office workers with extraneous body parts and egregious misspellings.
She was not happy. Given the school’s cost and reputation, she expected a top-tier education. This course was required for her business minor; its syllabus forbade “academically dishonest activities,” including the unauthorized use of artificial intelligence or chatbots.
It turns out he used a wide range of AI tools:
The professor in question, Rick Arrowood, later admitted to using a trio of AI tools (ChatGPT, the Perplexity AI search engine, and Gamma, an AI-based presentation maker) to prepare course materials. While not illegal, this use of AI triggered questions of transparency and academic integrity, particularly when the professor had discouraged students from using similar tools for their own assignments.
“He’s telling us not to use it, and then he’s using it himself,” Stapleton pointed out, branding the hypocrisy as unacceptable in a university of Northeastern’s standing.
Interestingly, a new study from another prestigious institution (Duke) finds that people both anticipate and experience judgment from their colleagues when using AI at work.
The study involved more than 4,400 people who, through a series of four experiments, indicated ample “evidence of a social evaluation penalty for using AI.”
“Our findings reveal a dilemma for people considering adopting AI tools,” the researchers wrote. “Although AI can enhance productivity, its use carries social costs.”
However, a reduced prestige rating does not stop higher education professionals from using the new tool.
Stapleton’s situation highlights the growing use of AI in higher education. A survey conducted by consulting group Tyton Partners in 2023 found that 22% of higher-education teachers said they frequently utilized generative AI. The same survey conducted in 2024 found that the percentage had nearly doubled to close to 40% of instructors within the span of a year.
Meanwhile, Northeastern University has rejected her claim.
Stapleton lodged a formal complaint with Northeastern’s business school over the incident, focused on her professor’s undisclosed use of AI alongside broader concerns about his teaching approach—and demanded a tuition refund for that course. The claim amounted to just over $8,000.
After a series of meetings, Northeastern ultimately decided to reject the senior’s claim.
I remember starting my scientific studies with a teacher instructing us on the use of a slide rule.
I recall trying to learn to use a sliderule when beginning my studies in science!!!! pic.twitter.com/2qRqz3ypIf
— Leslie Eastman ☥ (@Mutnodjmet) May 16, 2025
I also remember the joy of getting my first calculator to do chemistry homework.
There is a way to properly and honestly use tools. It’s just a matter of academia establishing a proper equilibrium on how to do research and teach critical thinking skills.

Comments
Replacement of educators with AI chatbots is already happening in courses at junior colleges and online colleges. But that is for courses where the student receives a “certificate of completion,” not a diploma.
I’ve got to go with the student on this one. She’s paying for, and expecting to take advantage of, the instructor’s experience and expertise. I wonder if there may be a breach-of-contract situation here.
Count me out. It’s a ridiculous complaint.
“This was particularly contentious because the course syllabus explicitly prohibited students from using AI tools for their assignments, a rule the professor himself was not following.”
I can’t count the number of courses I’ve taken where the teacher’s copy of the textbook had a big section at the end with all the exercise answers, but ours didn’t… and if we were caught with one, we’d be in hot water.
I should have sued back then, I could have retired sooner.
The students are there to learn stuff. The professor isn’t.
My favorite example was my high-school geometry class, which was taught by a coach fulfilling a mandatory instructional quota. The man was obviously no scholar. But when students couldn’t figure out how to get from point A to point B in a problem, he was always able to explain the reasoning. And that’s the true value of the teacher.
The professor’s experience and expertise are being utilized. If anything he presented from AI was factually incorrect or placed in an improper context, then and only then would she have a case. He would then be guilty of what students do when they submit AI-generated work as their own intellectual efforts. On the professor’s side, it is much different. He probably fed the information to the AI and had it organize the material in a simple, digestible manner. Then, if he did his due diligence, he validated the generated output.
Students have little expertise and are underqualified to vet AI. They also notoriously write bad prompts, especially when cheating under a time crunch. This leads to AI “hallucinations” like critiquing a work of literature that doesn’t exist, or citing caselaw that has been overturned or that came from a different country as if it were US legal precedent.
I have no problem with the instructor using it as long as he verifies that what he gets from AI is correct. I think it will allow better presentations and more study material. As for the students, no. They are there to learn, not to feed questions into an AI. When you hit the real world with a degree that you earned, then use AI as needed. It’s just a tool, like a graphing calculator. I use it for proposals, SOPs, and troubleshooting random Windows errors. I also have one loaded with my company’s advanced software manuals. When I find the root cause of an issue, I update the case with the fix, and the rest of the support department can then do their jobs faster and get our customers up and running.
Captures my thoughts exactly.
Don’t know what idiot downclicked you.
“I recall trying to learn to use a sliderule when beginning my studies in science!!!!”
I taught myself to use the slide rule at age 14, which served me well when I got to engineering school. We were expected to be proficient with slide rules before enrollment. I also used slide rules on the job for many years, until the HP-35 became popular in 1974. I think students should be kept away from calculators until perhaps the senior year in high school. With the slide rule, one must know the order of magnitude of one’s calculation; the slide rule merely refines mental calculation. The physicist John A. Wheeler (Richard Feynman’s advisor) went so far as to recommend that physicists should have a good estimate before running a computer calculation.
Electronic calculators have retarded students’ ability to do mathematical calculations. In many ways they have become a menace. Now AI has made everything much worse. I gave ChatGPT a problem over a year ago and it completely flubbed it, even with my help. I asked it to calculate the determinant of a 15th-order Hilbert matrix. I kept helping it, and it still couldn’t do it. On the other hand, Musk’s Grok 3 did it splendidly. I was impressed. BTW, Excel can’t even do order 6. Excel is junk. Don’t use it except for balancing your checkbook. NIST ran a series of tests years ago and Excel was at or near the bottom. Mathematica did every problem correctly. Real men use Mathematica (now called Wolfram).
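For readers wondering why spreadsheets and chatbots stumble on this particular problem: the Hilbert matrix is notoriously ill-conditioned and its determinant is astronomically small, so floating-point elimination loses essentially all precision. Exact rational arithmetic sidesteps the issue entirely. A minimal sketch in Python (just an illustration of the idea, not how any of the tools mentioned above do it):

```python
from fractions import Fraction

def hilbert(n):
    """n x n Hilbert matrix with exact rational entries: H[i][j] = 1/(i+j+1)."""
    return [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]

def det_exact(m):
    """Determinant via Gaussian elimination over exact fractions (no rounding)."""
    a = [row[:] for row in m]
    n = len(a)
    det = Fraction(1)
    for col in range(n):
        # find a nonzero pivot in this column
        pivot = next((r for r in range(col, n) if a[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            det = -det  # a row swap flips the sign
        det *= a[col][col]
        for r in range(col + 1, n):
            factor = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= factor * a[col][c]
    return det

print(det_exact(hilbert(3)))           # 1/2160, the known exact value
print(float(det_exact(hilbert(15))))   # tiny (roughly 1e-124) but computed exactly
```

Double-precision arithmetic carries only about 16 significant digits, so by order 15 a naive floating-point determinant is pure noise, while the fraction-based version is exact (just slower).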
I was proficient with a linear slide rule, but found it frustrating when answers went off the end of the rule, so I found and purchased a circular slide rule, which was a joy to use.
I still have them to this day.
I still have my dad’s slide rule. He used to have a circular one as well, but I don’t know what became of that. He showed me the basics of using a slide rule, but when I was in high school the TI-30 came out, which was amazing at the time. I was just happy I didn’t have to interpolate the trig tables anymore.
Learned slide rule in HS. Eventually got a TI50 in college. The HPs were too expensive for me.
Mathematica is great. I remember using computer algebra in its early days, back when the big system was Macsyma. Once or twice I could do an integral that it couldn’t, but it was great even then.
Those days are long past. It’s very helpful as an assistant for doing complicated algebraic calculations.
LLMs make lots of math errors. But all they have to do (and some are surely doing this) is hand off any calculations to Mathematica. It can be done behind the scenes, and the LLM can then insert “explanations” for the steps. At least the steps will be right.
The Feeling of Power.
Shut off electricity, steam, water and HVAC.
Let smarty pants figure out how to operate boilers, chillers, generators and compressors.
I’d love to see their facilities burn to the ground because they had no idea the jockey pump needs to run constantly to keep the fire suppression header charged to 90 psi.
The genie can’t go back in the bottle. The education system needs to be completely revamped. We can’t survive on braindead professors who let the computer generate their lessons for them.
AI chat, etc., will still be better than a DEI hire.
This seems to be an overreach to me.
She has a legitimate claim as to this one class. Refund her the money for those credit hours as she was paying for a professor to teach a subject and not AI to teach the subject.
A quick look at Northwestern’s site gives $563 per credit hour. (MUCH more when dealing with graduate schools.)
It is hard to get to her $8000 demand with those numbers for one class.
One has to wonder if her claim was rejected simply because the school looked into how many professors were doing the same thing and found a massive problem. They did not want to set a precedent on this issue.
Northeastern University tuition cost is substantially larger than Northwestern University – $2,000+ per semester hour or $32,000+ per semester for a full time student.
I guess the little darling will next complain that the professor has a text book with all of the answers in the back, and she feels she’s at a disadvantage because she can’t use the textbook with answers.
I learned two things during undergraduate and graduate school.
1. I spent more time teaching myself than the professor could ever teach in a 1-to-1.5-hour course twice a week.
2. Cooperate and graduate. Most of my very liberal professors wanted a certain slant to every paper. I learned that fast in a PolSci class where the paper would answer the question “Is the Constitution an Economic or a Social Document?” Compare and Contrast.
To him, the Constitution was a Social document that hindered minorities, was a living document that could be interpreted based on the circumstances of the day, and that was written by old white men who wanted to keep women barefoot, pregnant, and in the kitchen.
I withdrew from a PoliSci major to Finance and Accounting after that class.
She’s not really paying for an education. She can get that in tons of places for far cheaper – and even free if she has some drive and is halfway independent. What she’s paying for is the piece of paper.
This is, of course, a stupid argument. There is no parity between a teacher and a student. A teacher can look at his notes while lecturing but students can be barred from looking at notes while taking a test, just to point out one simple example.
Frankly, the student making this argument proves that she’s too stupid to even be in college.
The professor organized the class, and stood in front of their darling faces twice a week for 13 weeks teaching and answering questions. Is that labor free?
If the instructor rolled out a robot to deliver his lectures, the student might have a point. But to complain that the instructor was relying “too much” on AI is silly and vague. Will said student also complain if the instructor researches on the internet instead of hanging out in the stacks and using the library’s card catalog? Is she going to complain if he uses electric lights and a computer to prepare lecture notes rather than scratch them out on birchbark by candlelight or a kerosene lamp?
As for the supposed double standard, he’s already gone to school and paid his dues. Once she graduates she can do what she wants.
Nothingburger. The professor was not using AI to claim credit for someone else’s work product. He’s using it to enhance his presentation to the students on the work they will need to do. Student thinks she has a gotcha but it’s really not.