White House says no need to restrict 'open-source' AI at least for now

wdtn.com

61 points by plinkplonk a year ago · 64 comments

OldGreenYodaGPT a year ago

Regulating open-source AI will only help giants like Google and OpenAI, stifling innovation. It creates barriers only they can afford, limiting competition and diversity. Open source fosters transparency and rapid progress. We do not need government regulation, or we'll end up like Europe, with China taking the lead.

  • GauntletWizard a year ago

    The problem is that the OSI's definition of "Open Source" AI doesn't match its definition of "Open Source" software, or really anything else.

    • anon373839 a year ago

      I think all of the hand-wringing about open weights vs. open source, and which term to use, is an ideological point people are getting stuck on. I don’t think it’s a real problem.

  • talldayo a year ago

    > It creates barriers only they can afford, limiting competition and diversity.

    I would argue this is inherent to the training and compute cost of all large language models.

    • binary132 a year ago

      Quantized Llama 3.1 can run on an Amazon GPU instance for $32/hr now

      • talldayo a year ago

        That's still prohibitively expensive for anyone that doesn't intend to make their money back. For training it's even more outrageous.

        • binary132 a year ago

          $64K isn’t even one developer’s salary.
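
          A quick back-of-envelope check on that figure, assuming the $32/hr rate quoted upthread and a rough 2,000-hour working year (both assumptions, not quotes from any provider):

            hourly_rate = 32          # $/hr for the GPU instance quoted above
            hours_per_year = 2000     # approximate full-time working year (assumed)
            print(hourly_rate * hours_per_year)  # 64000 -> the ~$64K figure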

          • talldayo a year ago

            And how many people can afford to pay a developer's salary after utilities and mortgage?

            • binary132 a year ago

              I think you’re missing my point: hiring someone is not that costly relative to training a billion-dollar model from scratch, so the barrier is definitely getting lower. An individual proprietor with a moderately successful business (or just business loans, let’s face it) can hire at least a couple of employees.

    • anon373839 a year ago

      Small models can be fine-tuned to perform specific tasks with similar accuracy to large models. Small models can be served for internal use with a VERY modest hardware outlay.

      There are also providers now that will let you upload low-rank adapters you have trained on top of open foundation models, so that you can use their efficient serverless infrastructure with your fine-tuned models. This requires even less capital.

      None of this would exist had OpenAI’s vision of centralized, locked-down API access become the reality.
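
      A minimal sketch of that low-rank adapter workflow, assuming the Hugging Face transformers and peft libraries (the base checkpoint and hyperparameters are illustrative placeholders, not a recommendation):

        from transformers import AutoModelForCausalLM, AutoTokenizer
        from peft import LoraConfig, get_peft_model

        base = "meta-llama/Llama-3.1-8B"        # any open foundation checkpoint (placeholder)
        tokenizer = AutoTokenizer.from_pretrained(base)
        model = AutoModelForCausalLM.from_pretrained(base)

        lora = LoraConfig(
            r=16,                               # low-rank dimension: the adapter is tiny, the base stays frozen
            lora_alpha=32,
            target_modules=["q_proj", "v_proj"],
            task_type="CAUSAL_LM",
        )
        model = get_peft_model(model, lora)
        model.print_trainable_parameters()      # only the adapter weights are trainable

        # ...fine-tune as usual, then save just the adapter (a few MB, not the full weights),
        # which is the artifact a serverless LoRA-hosting provider would accept as an upload.
        model.save_pretrained("my-task-adapter")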

  • b112 a year ago

    Comrade, open source is the wellspring of communism!

    (As Microsoft explained years ago)

  • splwjs a year ago

    That doesn't explain why the White House doesn't see the need to regulate it; if anything, it gives them more reason to.

laweijfmvo a year ago

Would love to hear the White House explain to me what Open-Source AI is, in their own words.

  • burkaman a year ago

    This sounds sarcastic, but as is often the case the White House has quite competent people working on this who have put a lot of thought into it already: https://www.ntia.gov/federal-register-notice/2024/dual-use-f...

    Unfortunately it's past the deadline for you to add your own comments on this, but I'm sure there will be future RFCs if you have thoughts.

    • jahewson a year ago

      In the spirit of this pedantry I’d like to point out that the NTIA is not the White House.

      • ta1243 a year ago

        The NTIA "serves as the president's principal adviser" in its field.

      • burkaman a year ago

        That's true, but "White House" in this headline was actually referring to "Alan Davidson, an assistant secretary of the U.S. Commerce Department". I'm actually not sure where the article got the term "open source" from either; it says it's from a report, but I don't know what report or who wrote it. It seems like they are using "White House" as a term for people in the executive branch working at the direction of the White House, which is common.

        When I say "the White House has quite competent people working on this" it's because the NTIA RFC I linked was based on a direct executive order from President Biden.

      • popalchemist a year ago

        Summary:

        On October 30, 2023, President Biden issued an Executive Order on “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,” which directed the Secretary of Commerce, acting through the Assistant Secretary of Commerce for Communications and Information, and in consultation with the Secretary of State, to conduct a public consultation process and issue a report on the potential risks, benefits, other implications, and appropriate policy and regulatory approaches to dual-use foundation models for which the model weights are widely available. Pursuant to that Executive Order, the National Telecommunications and Information Administration (NTIA) hereby issues this Request for Comment on these issues. Responses received will be used to submit a report to the President on the potential benefits, risks, and implications of dual-use foundation models for which the model weights are widely available, as well as policy and regulatory recommendations pertaining to those models.

    • bravetraveler a year ago

      Piling bodies, sorry - creating committees never impedes an effort. Nuh uh, never! That was sarcastic.

      Here's a bit of help for them, development happens with our help or not. For every regulated repository that's removed, there's a mirror/fork.

      I don't really have a point... just having a little fun at the expense of the latest thing in the harbor. I'll honestly stay apprised.

  • talldayo a year ago

    Seems pretty clear to me they're using it as a colloquialism for freely distributed weights.

    • jimsimmons a year ago

      Is that "umm ackshually .." bit really worth doing here?

      Open weights vs training code is not a distinction worth drawing for the average person

      • bravetraveler a year ago

        I'm just barely more informed than the average person... and I still don't think it is worth drawing.

        Perhaps I'm wrong, willing to discuss. I'm simple so my thoughts on this are also.

        Weights, training code, or... CAD files that just so happen to make child-size cages. Until I'm actually caging children - buzz off.

sandworm101 a year ago

Better question: Is it possible to restrict open source AI?

If it is open, then everyone has access. Any restrictions would be akin to demanding that Linux not use strong encryption or that Firefox implement content censorship.

  • grumpyinfosec a year ago

    It's possible to restrict DIY building of pretty much anything if your end goal is to stop people from doing something outside of their basement with it. I can't build my own open-source coal-fired power plant and expect to sell power without the EPA coming to kill me. The same would go if I used an open-source AI that violated some new consumer protection / anti-fraud law and chose to use it over the public internet or build it into a product. Hell, you could probably go after the devs as accessories if you really wanted to.

    The license really does nothing to protect your project from regulation; it's just that the government doesn't care about open source yet.

  • michaelt a year ago

    It hasn't proven possible yet, but who knows what will be developed in the future?

    A lot of open source efforts depend on big companies training million-dollar models and giving them away. These companies will often apply some censoring adjustment to the weights, which the open source community then undoes through fine tuning.

    But perhaps in the future new methods of censorship will be developed, which are radically harder to undo?

    And of course, there's always heavy-handed options available - if we can require hairdressers to hold professional licenses, we could require the same of anyone who wants to upload to huggingface or civitai.

  • Legend2440 a year ago

    Until 1996 the US did restrict the export of strong encryption, and exported software had to implement weak 40-bit encryption.

    • perihelions a year ago

      Those restrictions were unconstitutional for the entire duration they were in effect.

  • p0w3n3d a year ago

    Intel Management Engine bug to find those computers which are using the free model?

rvnx a year ago

Might change after getting lobbying checks from Google, Microsoft and Apple.

  • talldayo a year ago

    Why? All three of those companies have an obvious and outspoken commitment to releasing models for free; they aren't trying to manipulate fear to sell a product the way OpenAI is.

    • chatmasta a year ago

      Microsoft is invested in OpenAI. Have they released any open weight models themselves?

      Apple is similarly partnering with OpenAI.

      Google has released nerfed versions of its Gemini models (Gemma 2B and Gemma 7B).

      It seems the only company truly setting an example for open weights is Meta. Hopefully the other companies realize that it’s in their best interest to do the same (as it seems Google is begrudgingly realizing with its nerfed releases).

      Eventually open weights will win, and if OpenAI (the most ironically named company) continues to rely on closed models as its moat, it will lose.

      • kkielhofner a year ago

        “AI” is much more than LLMs…

        Not only have all three of these companies released LLM model weights, Google and Microsoft especially have released nearly countless model architectures, weights, evaluation/training/inference/etc frameworks, toolkits, etc across an extremely wide spectrum.

        • rvnx a year ago

          They mainly offer open tooling to help you adopt their closed-source tech (e.g. RAG frameworks) but you don't get the freedom to do what you want with the model (which is the core).

          At some point, these big companies may even want to put more regulation or restrictions on their open-weights / open-source competitors. "Hey this is not validated for safety, you can use only Gemini"

          • talldayo a year ago

            They don't? Microsoft and Google both promote open-source inferencing frameworks, and while both also have proprietary products, I think it's completely dishonest to say they "mainly" support them. ONNX and TensorFlow have been supported longer and more productively than any of the closed-source AI frameworks Google or Microsoft offer.
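
            To make the "inferencing frameworks" point concrete, here is a minimal ONNX Runtime sketch (the model file and input shape are hypothetical placeholders):

              import numpy as np
              import onnxruntime as ort

              session = ort.InferenceSession("model.onnx")          # any exported ONNX graph (placeholder path)
              input_name = session.get_inputs()[0].name
              dummy = np.zeros((1, 3, 224, 224), dtype=np.float32)  # shape depends on the model
              outputs = session.run(None, {input_name: dummy})
              print([o.shape for o in outputs])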

            • rvnx a year ago

              Let’s see what the future holds.

              My feeling is that the next step will be for regulators to restrict more and more open-source/open-weight LLMs (and their diffusion-model equivalents) in the name of safety.

              Perhaps requiring certifications in professional use for example.

              I see it as a happy coincidence for these large players if this is the case, because only folks with deep pockets and the right thoughts will be able to get certified.

              • talldayo a year ago

                You are currently posting in a thread where the root article is a press release detailing why open-weight LLMs aren't going to get restricted by the incumbents. I also have seen no indication that any presidential hopefuls intend to regulate the tech either. There are no demonstrated dangers yet, and the value of exporting this digital snake oil far outweighs the cost of other people asking ChatGPT how to make a fertilizer bomb.

                • rvnx a year ago

                  Yes, exactly. I'm saying it's good that things are this way now, because in the future it may change because of lobbyists (see above for why).

      • roywiggins a year ago

        > Microsoft is invested in OpenAI. Have they released any open weight models themselves?

        Phi-3?

        https://azure.microsoft.com/en-us/products/phi-3
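
        The weights are on Hugging Face as well; a minimal sketch of loading one of the published Phi-3 checkpoints with transformers (the model id and generation settings here are only illustrative):

          from transformers import AutoModelForCausalLM, AutoTokenizer

          model_id = "microsoft/Phi-3-mini-4k-instruct"   # one of Microsoft's openly released Phi-3 checkpoints
          tokenizer = AutoTokenizer.from_pretrained(model_id)
          model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

          prompt = "In one sentence, what is an open-weight model?"
          inputs = tokenizer(prompt, return_tensors="pt")
          outputs = model.generate(**inputs, max_new_tokens=64)
          print(tokenizer.decode(outputs[0], skip_special_tokens=True))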

      • lsaferite a year ago

        > Microsoft is invested in OpenAI. Have they released any open weight models themselves?

        Do the Phi series models not count?

        • chatmasta a year ago

          Sure they do. I was genuinely asking.

          • Onawa a year ago

            Well then yes, I would definitely say Phi. And on top of that Microsoft has been releasing a ton of the tooling surrounding LLMs and AI as well, including Semantic Kernel.

      • apwell23 a year ago

        > Apple is similarly partnering with OpenAI.

        I think this is too generous of an interpretation for their deal.

    • yjk a year ago

      Really makes me wonder what the angle here is for them. With open-source software I can somewhat understand it: tools that become market standards are easier to hire for, and you get feedback on your tooling. It seems unlikely that either will be true for open-weight models. It also seems unlikely they would be able to establish market domination and then increase prices if everything remains open.

      • sblom a year ago

        In all three cases, they own hardware (Apple in a slightly different sense) that they'd love for you to pay to run your favorite models on, open or closed.

ChrisArchitect a year ago

Source: https://apnews.com/article/ai-open-source-white-house-f62009...

bananapub a year ago

Fascinating how successful the bullshit rebranding of "sometimes we upload a TB file of weights" as "open source" has been for them politically. I can't really imagine they thought it would go this well. It will be interesting to see how this unexpected boon of a loophole changes their strategy, particularly """OpenAI""", who didn't think to give themselves enough cover.

_fat_santa a year ago

Say the government actually wanted to regulate open source AI, how exactly would they do that?

If the govt wants to regulate commercial AI, they can do that by going to the companies responsible for building that AI and say "do X, Y and Z or else we will punish you with A, B or C". But what do you say to a collective of developers that could be all around the globe? What stops that group from just saying "F* You" and continuing on?

war321 a year ago

Ya love to see it.

stonethrowaway a year ago

See my comment here on FBs strategic positioning: https://news.ycombinator.com/item?id=41090142

Updated link.

MP_1729 a year ago

Good. Effective Altruism is trying to destroy democratic institutions, and it's likely a bigger threat to society than fascism. The pro-progress people need to organize themselves to stop types like SBF from giving AI to China.
