ChatGPT Edu

openai.com

45 points by Lealen 2 years ago · 49 comments

newswasboring 2 years ago

Sad. I was expecting something actually related to education. This is just the premium service with more surveillance tools for the school.

mac-chaffee 2 years ago

Some of the use-cases mentioned would mean FERPA applies: https://studentprivacy.ed.gov/faq/i-want-use-online-tool-or-...

Surprised FERPA wasn't mentioned explicitly. At least this version doesn't use the data for training, but I shudder to think of all the college administrators dumping student information into their personal ChatGPT accounts right now...

  • ModernMech 2 years ago

    Seemingly, edu solutions from big tech are developed without ever asking for input from actual educators. They usually come from an "I went to school, so I know what educators need" perspective, which is often wrong. The offerings are typically half-baked, buggy, and deaf to the needs of educators. GitHub Classroom comes to mind. ChatGPT Edu is apparently the next example, where none of the features are educator-focused; instead they are focused on bringing the product to students.

    Educators have to be especially wary of these efforts, because every tech company comes up with the bright idea of "let's give it to students for free, they'll get hooked, and then once they're earning money they'll buy our product." So we are inundated with spam for edtech the way doctors are spammed by big pharma reps. I've got messages in my inbox right now offering me $$$$ to blog about some AI startup. I've got some tech rep hounding me to push some online C++ tool. Now add OpenAI to the list.

    • aleph_minus_one 2 years ago

      > They usually come from a "I went to school, so I know what educators need" perspective, which is often wrong.

      Change it to "I know what educators would need if they were like me", and it (surprisingly) becomes mostly correct. For example, in the past I volunteered to prepare talented pupils to study mathematics at university, so I do claim to have some experience in education.

      I think I know some things about what would make sense for educating children, but the kind of people who actually become educators are quite a different breed from me. So educators would likely not like my ideas (they don't fit the political climate and/or desired style of education), even though I think they do make sense (and I would dogfood them). The latter is evidenced by work colleagues telling me they would love to see my ideas put into textbook form. Thus, my ideas seem to fall on much more fertile ground with an audience of gifted parents who would love for their children to become gifted, too, than with the people who become educators.

    • nunez 2 years ago

      Agreed. I'm biased because my wife is an educator, but so so SO many edtech companies would benefit from hiring former teachers into business development/product roles instead of filling those roles only with Ed.D.s/Ph.D.s, many of whom have limited field experience.

      That said, maybe OpenAI did this?

      • ModernMech 2 years ago

        Maybe they did; I'm eternally optimistic. I'd have more confidence if OpenAI had used educator voices to say something that resonated with me. Instead, the only quote they provide is marketing gobbledygook from some C-suite admin.

        Why couldn't OpenAI get a quote from some professor using the OpenAI offering in their classroom, saying how it benefits their students? All I learned from Kyle is that he's using ChatGPT to "harness", "collaborate", "transform", and "integrate"... completely meaningless to me.

        Makes you realize this website is not targeted to appeal to educators, but to university CIOs.

        Actually, for the heck of it I asked ChatGPT to write me marketing copy for a website for ChatGPT Edu, and it did a better job centering educators and students compared to what OpenAI released here: https://chatgpt.com/share/bd581724-273c-40d5-aa55-0cd0e4d88a...

cowmix 2 years ago

Arizona State University was an early (first?) adopter in providing broad access to ChatGPT Enterprise, but the implementation has not been as smooth as I hoped. My wife works at ASU, and from our discussions, it’s clear that the rollout has been somewhat disorganized.

The promotional announcements led me to expect "ChatGPT 4 for all!" Yet the distribution of access has been ambiguous, leaving many staff members, including my wife, in the dark. There has been little to no guidance on how to effectively integrate ChatGPT into daily tasks, what the appropriate use cases are, or any potential costs involved.

The gap between the initial excitement generated by press releases and the reality of the rollout's execution has been significant. Even now, the specifics remain unclear, despite my efforts to assist my wife in navigating this whole thing.

  • nunez 2 years ago

    That much was obvious from ASU's testimonial on the page. This reads more as resume-driven development than "this product is actually solving problems for us"

    > “Integrating OpenAI's technology into our educational and operational frameworks accelerates transformation at ASU. We're collaborating across our community to harness these tools, extending our learnings as a scalable model for other institutions.”

elicksaur 2 years ago

LexisNexis, Westlaw, and Bloomberg all use the same tactic with law schools. (Legal research search engines)

Provide free usage to students so they get used to using it indiscriminately, then charge non-students on a usage basis depending on the plan.

  • btbuildem 2 years ago

    Lots of corps do this (not saying it's great). Getting students familiar with the tools you make is one of the better ways of ensuring they'll want to use them at work. Some companies go as far as worming their way into curricula and having their software packages be an official part of the coursework.

  • pvtmert 2 years ago

    Microsoft has been doing the same for decades now.

    Especially in third-world countries, they even provide free materials and education... which is high ROI down the line.

Bostonian 2 years ago

The elephant in the room is that many students are using ChatGPT to write their papers. Since they will use similar tools at work to create presentations, I wonder how much colleges should try to police students' use of ChatGPT.

  • i_am_proteus 2 years ago

    In a class of any substantial size, it's very obvious who is using ChatGPT and who is thinking for themselves.

    A large fraction (often 70%) of responses will be identical in structure and content, if not verbatim copies of each other. These are the people who have used ChatGPT. These are the people who tend to do very poorly on closed-laptop examinations.

    Students choose how much of their critical thinking to offload to the computer, and how much to develop their own critical-thinking skills.

    Robust assessments in classes still paint an accurate picture of students' capabilities. Designing those assessments requires work for instructors. Pour one out for the lazy.

    I suspect that in the coming years, industries will reward those who are capable of adding value beyond naïve parroting of LLM output. Pour one out for the lazy.
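    To make "identical in structure and content" concrete: one crude way to surface those clusters is pairwise TF-IDF cosine similarity over the submissions. A minimal sketch, not anything instructors necessarily do; the folder name and the 0.9 threshold are invented:

      # Sketch: flag pairs of near-identical submissions.
      # The "submissions" folder and 0.9 threshold are placeholders.
      from pathlib import Path
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      texts = [p.read_text() for p in sorted(Path("submissions").glob("*.txt"))]
      sim = cosine_similarity(TfidfVectorizer(stop_words="english").fit_transform(texts))

      for i in range(len(texts)):
          for j in range(i + 1, len(texts)):
              if sim[i, j] > 0.9:  # suspiciously similar pair
                  print(f"submission {i} ~ submission {j}: similarity {sim[i, j]:.2f}")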

    • flir 2 years ago

      > who is using ChatGPT and who is thinking for themselves.

      You can do both. My son's using the chatbots as idea/synonym/high-level structure generators and general tidier-uppers. I'd be surprised if he's taking more than 10% of their suggestions verbatim, but it's great for rubber ducking.

      • throwaway48476 2 years ago

        I've found ChatGPT to be great for writer's block, as it's very good at giving me examples of what not to write.

    • NemoNobody 2 years ago

      There will be a few who can write better prompts than others. Given your confidence, I'm quite certain professors all across the world have already given high grades to AI-written papers while worrying about that very problem coming, and failed others for it.

      • aaplok 2 years ago

        My experience has been that performance has gone down significantly since students gained access to genAI. I probably get a fifth of the number of high performers I was getting before. So I'm not worried about the hypothetical tiny minority who can write good prompts. I am worried about the larger group who could have done good work but ended up relying on ChatGPT too much.

      • i_am_proteus 2 years ago

        The real question is whether to meaningfully "grade" papers that can be written with the aid of a computer at all.

        Alternative systems (which existed before LLMs) include requiring at-home work to earn a seat at the exam, but marking it for feedback only, with the exam being the assessment graded for credit.

    • yousif_123123 2 years ago

      At the same time, using ChatGPT or any LLM productively, and getting the most out of it to help you think and reason better, is a skill of its own that requires development.

      • NemoNobody 2 years ago

        You mean like teaching me how to get Google to solve my math problems for me, instead of teaching me the logic behind the problems - what use do I have for theoretical calculus?

        The only time in my life I've been unable to complete something mathematically, I was trying to compute the accumulated ROI of a shared money pool for crypto trading - and I'm probably not even saying that correctly.

        I couldn't enter the formula for the math I needed into Google because I didn't know how to; I haven't owned a TI-anything calculator since high school and had completely forgotten even the name of the branch of math I was doing.

        I knew Google could do exactly what I needed, though, and it was frustrating having to look literally every step of it up.

        I think AI could have eliminated all of this.
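        For what it's worth, the calculation described (compounding per-period returns on a shared pool, split pro-rata) is only a few lines once you know what it's called. A sketch with invented numbers:

          # Sketch: compound per-period ROI on a shared pool, then split pro-rata.
          # All stakes and returns are invented numbers.
          contributions = {"alice": 500.0, "bob": 1500.0}   # initial stakes
          period_returns = [0.03, -0.01, 0.05]              # per-period ROI

          pool = sum(contributions.values())
          for r in period_returns:
              pool *= 1 + r                                 # compound the whole pool

          total_in = sum(contributions.values())
          for name, stake in contributions.items():
              payout = pool * stake / total_in              # pro-rata share
              print(f"{name}: paid in {stake:.2f}, now worth {payout:.2f}")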

        • yousif_123123 2 years ago

          Yes, it could be like learning how to use Google to get information, or to learn the logic to solve something. I think the main under-hyped aspect of AI currently is how useful it can be when integrated with a whole bunch of regular programming code and UI/UX to implement parts of functionality that can't be programmed explicitly. Even if GPT-4 is the last AI model, if it keeps getting cheaper and faster, the cool features that can be built with some code and a few API calls in parallel or in sequence will add a lot to the apps people interact with every day.

          Like my Dad uses multiple apps to get the information he wants for all the sports games that happen during the week, and no app is perfect. If he could prompt something to make a UI with the relevant info he wants displayed like he wants, he would be so much happier. It's little things like that that will make software way more customized and useful for people.

          I think there's a lot of hype about AGI, and there'll be more, but just as another tool in the programmer's toolbox, LLMs will allow better products to be developed (if people learn how to prompt and think with LLMs in mind instead of dismissing it all when they see a mistake).
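          As a rough sketch of "a few API calls in parallel" using the official openai Python client (the model name and prompts are just placeholders):

            # Sketch: fan out independent LLM calls in parallel, then combine.
            # Assumes the official `openai` package; model/prompts are placeholders.
            from concurrent.futures import ThreadPoolExecutor
            from openai import OpenAI

            client = OpenAI()  # reads OPENAI_API_KEY from the environment

            def ask(prompt: str) -> str:
                resp = client.chat.completions.create(
                    model="gpt-4o",
                    messages=[{"role": "user", "content": prompt}],
                )
                return resp.choices[0].message.content

            prompts = ["Summarize today's NBA scores.",
                       "Summarize today's Premier League scores."]
            with ThreadPoolExecutor() as pool:
                print("\n\n".join(pool.map(ask, prompts)))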

      • monkaiju 2 years ago

        Citation needed

  • eganist 2 years ago

    Wouldn't surprise me if schools go the route of forcing a cloud toolset (e.g. Office 365 or Google Workspace) and measuring proof of work - in a very literal sense, not a cryptographic one.

    Ripe for being gamed, but it's probably the only way to do PoW without becoming wildly intrusive with personal devices.
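    One (speculative) shape that could take: read a document's revision timestamps from the Google Drive API and check whether the work accreted over days or appeared in a single paste. The revisions.list endpoint is real; the credentials setup and file ID are placeholders:

      # Speculative sketch: "proof of work" from a Google Doc's revision history.
      # Assumes application-default credentials; the file ID is a placeholder.
      from googleapiclient.discovery import build

      drive = build("drive", "v3")
      revs = drive.revisions().list(
          fileId="FILE_ID", fields="revisions(id,modifiedTime)"
      ).execute().get("revisions", [])

      times = [r["modifiedTime"] for r in revs]
      # Organic writing tends to show dozens of revisions spread over days;
      # a wholesale paste shows one or two. Crude and gameable, as noted.
      if times:
          print(f"{len(times)} revisions, first {times[0]}, last {times[-1]}")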

    • rahimnathwani 2 years ago

        measuring proof of work
      
      This is one of the ideas Sal Khan mentions in his recent book: interaction history with Khanmigo.

      But there's nothing to stop you from using another AI system to interact with Khanmigo.

    • eviks 2 years ago

      Or you could switch to proof of knowledge and use that old boring tech of a conversation. Also harder to game at such a low latency.

dougb5 2 years ago

"Professor Nabila El-Bassel at Columbia University is leading an initiative to integrate AI into community-based strategies to reduce overdose fatalities. Her team built a GPT that analyzes and synthesizes large datasets to inform interventions, reducing weeks of research work into seconds."

That sounds impressive, but the link they give for this research just describes a (huge) NIH grant and speculates on all the ways AI could help allocate opioid treatment. Is there a paper with actual results from using the described GPT?

wcchandler 2 years ago

Fun. I guess I’ll have this project on my to-do list when I get back from vacation. We already have a managed GPT offering from Azure so it’ll be interesting to see where these intersect.
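For what it's worth on the intersection: the Azure-managed flavor is reachable through the same official openai Python package; only the client construction differs. The endpoint, API version, and deployment name below are placeholders:

  # Sketch: same `openai` package, Azure-flavored client.
  # Endpoint, API version, and deployment name are placeholders.
  from openai import AzureOpenAI

  client = AzureOpenAI(
      azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
      api_version="2024-02-01",
      api_key="...",                # or an Azure AD token provider
  )
  resp = client.chat.completions.create(
      model="YOUR-DEPLOYMENT",      # Azure routes by deployment name
      messages=[{"role": "user", "content": "Hello"}],
  )
  print(resp.choices[0].message.content)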

cbolat 2 years ago

OpenAI is working the exact same way drug dealers work... give it away to students for free at parties, get them addicted to it.

  • kkyr 2 years ago

    So just like any other piece of software that offers a free student license?

  • eli 2 years ago

    This is for university administrators, and I don't think that's how drug dealing works.

  • mateus1 2 years ago

    Google offered my Uni free unlimited storage forever a few years ago.

    Students are now scrambling to migrate; it turns out forever at Google is a lot shorter than you think.

  • throwaway115 2 years ago

    Since we're making silly comparisons, charitable organizations also give things away for "free", making the recipients somewhat dependent on them. There's plenty to criticize OpenAI about, but why associate them with drug pushers?

  • ffhhj 2 years ago

    There are only two industries that call their customers ‘users’.

  • HeatrayEnjoyer 2 years ago

    Where do I go to get these free drugs?

hubraumhugo 2 years ago

I remember the fascination of reading through physical encyclopedias when I was a kid.

Having the collective knowledge of the world in my pocket and a personal tutor for any topic still sounds surreal.

  • gmuslera 2 years ago

    It is different from a passive encyclopedia. For good and for bad, it goes right to the core of what you want to know. You can jump to the conclusion of a paper instead of going through all of it, or get just what you need for the problem or question you want to solve. Some of the inefficiencies of the old model had their side benefits.

    But having a tutor or teacher to actually interact with also has important benefits. It is not a lesson for a class of who knows how many students, but for you in particular. You may miss other students' questions, but having the freedom to ask, without social pressure, without worrying about time, even about old topics that may or may not be related to the current one, may be a big improvement.

    The next question we should ask ourselves is what the role of libraries, teachers, and teaching institutions would be in this landscape. What are the cracks in this new teaching model that they can adapt to fill?

  • flir 2 years ago

    IMO you can only really trust it on subjects with tight "REPL loops" (for want of a better term). Subjects where you can test its output against reality immediately.

    If you're discussing, say, philosophy with it, it has the potential to lead you deep into the weeds.
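    A minimal sketch of such a loop, with the task and model name as placeholders (and exec of model output belonging in a sandbox in practice):

      # Sketch of a tight "REPL loop": generate code, run it, test it.
      # Task/model are placeholders; the model may still wrap output in fences,
      # and exec() of untrusted code should be sandboxed in practice.
      from openai import OpenAI

      client = OpenAI()
      code = client.chat.completions.create(
          model="gpt-4o",
          messages=[{"role": "user", "content":
                     "Write a Python function is_prime(n). Reply with code only, no fences."}],
      ).choices[0].message.content

      ns = {}
      exec(code, ns)
      assert ns["is_prime"](7) and not ns["is_prime"](8)  # test against reality
      print("output survived contact with reality")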

  • harperlee 2 years ago

    Had a similar feeling with a Grolier 95 CD (one of the first to enter my home at the time); it seemed infinite.

  • aleph_minus_one 2 years ago

    > Having the collective knowledge of the world in my pocket and a personal tutor for any topic still sounds surreal.

    Basically for every topic I ask this "personal tutor" (AI models) about, it hallucinates stuff. My work colleagues tell me I should ask it about more "normal" topics and ask for fewer paper references. Seriously: for more "normal" topics, or for topics where quite a few false answers are acceptable, I don't need a personal tutor.

_the_inflator 2 years ago

Smart marketing move. Society's influential people come from universities.

Many white-collar jobs require a degree, and people from institutions like schools get trained on tools they later use at university and at work.

These people will then influence others as well.

Reminds me of Steve Jobs's push to sell computers to educational institutions.

micw 2 years ago

IMO, if universities want to use "AI", they should use their resources to build their own open language models rather than adopting a commercial one.

  • aleph_minus_one 2 years ago

    IMO, if universities want to use textbooks, they should use their resources to write their own open-access ones rather than using commercial ones.

    SCNR

    • micw 2 years ago

      Absolutely. Same applies for schools.

      In particular, not every school/university should do it on its own. But together they should (e.g. by funding the creation of open content).

      That would allow them to give out books (or copy content from them) for free, making them accessible to more students.

    • ImPostingOnHN 2 years ago

      They often do so, good idea.

    • spookie 2 years ago

      They often do.
