New Intel processors to contain DRM suggested by Hollywood studios
techradar.com
I seem to remember the last time someone tried something like this (the Trusted Platform Module), a researcher burned the top off the chip and read out the private key with a Focused Ion Beam workstation: http://www.youtube.com/watch?v=Qk73ye2_g4o (WARNING: extreme hacking ahead!)
You would've thought people would have worked this one out by now.
Who fancies running a HN pool on how long until the root keys are public? (I'm serious - how funny would that be. We just need someone with enough trust from the community to handle the money. And someone who lives where online gambling is not regulated...)
I would hope this constitutes anticompetitive behavior. If only Intel can decrypt movies and studios only make them available for Intel procs, this could seriously hurt all other chipmakers.
AFAIK Intel and AMD cross-license everything.
And it's not that tech runs just from Intel to AMD.
x64 is AMD tech that got adopted by Intel. Monolithic multi-core designs are also something Intel adopted from AMD.
Intel wouldn't dare destroy AMD, since AMD was created for the sole reason of preventing Intel from being torn apart. But Intel will do everything to keep AMD's market share somewhere comfortably below 20%.
And what about ARM and Apple chips (like the A4)?
What about them?
Is Intel supposed to just implement it for them or something?
Or are they supposed to come to their own terms with the studios?
Well, we're talking about Intel and AMD, and I thought we should also remember there are more chipmakers out there that would have an interest in decoding encrypted content.
But I seriously doubt that ARM chipmakers not getting access to the DRM core would constitute anti-competitive behaviour.
Especially since these are completely different markets that do most things differently. That's what I wanted to point out - if they want their chips to play DRM content, they'll have to negotiate it themselves with Hollywood. I also strongly suspect that this is not an exclusive pact.
Content producers are well aware nowadays that more and more (probably the majority?) of content is consumed on non-x86 devices.
How is this different from Netflix only making movies available via Windows DRM? Netflix has expanded, but initially it was the same deal.
And it remains entirely at Netflix's discretion whom they trust. I agree it's a Bad Thing, but the content industry is entirely predicated on artificial monopolies.
If I'm not mistaken, Netflix uses multiple types of DRM (for iOS, PCs, Wii, PS3, etc.). The Windows DRM is for PCs and the Xbox, possibly some others (Roku might use it; I think the NXP chip uses Microsoft DRM).
This is actually the problem with Netflix on Android - Android doesn't have any DRM standard, so it's up to the hardware manufacturer to include it.
That wasn't always the case.
And even so, this is just another DRM tool that Netflix will use, unless they opt to only use Intel's built-in DRM (which would be strange given the wide competition Intel's hardware DRM faces from TPMs and the like).
I expect it'd be covered by the same agreements that let AMD and Via clone, say, the latest SSE instructions.
I for one can't wait for the high-quality new film rips to show up on BitTorrent once this is inevitably cracked.
Why wait? Blu-Ray is about as high-quality as it gets, and nobody is having any trouble pirating those.
Oh, never mind. I misread the article - I thought it was promising films before they come out on DVD.
My point still stands though - there is absolutely no way this doesn't get cracked.
As long as watchable images are displayed on a screen, someone will figure out a way to "archive" them for later.
Technically, what is this specific DRM? Is this something other than HDCP?
Could someone here please explain the likely mechanics of how this DRM will work? I must presume that decrypted content would not be available to the OS and would ostensibly only be exported from the GPU via some video output (HDMI w/ HDCP, etc.)?
Presuming this is the case, how would key management happen? Would every chip have its own unique keypair? If not, an attack similar to the one linked to here by JohhnieCache would render this scheme useless. If each keypair is unique, how would a particular content provider know that a public key was actually one paired with a private key only contained in Intel's silicon? Would Intel provide a registry of all valid public keys to content providers? If there is no such central registry, would the public keys themselves be signed by Intel so that content providers could be certain of their origin?
Please correct any wrong assumptions on my part. I know little of cryptography compared to many here, but I am frustrated by how little technical commentary has accompanied the articles I have read about this new processor family.
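To make the attestation question concrete: one plausible design (purely a guess, not anything Intel has published) is that Intel burns a unique keypair into each chip at the factory and signs the public half with an Intel root key. A content provider then only needs Intel's verification key, not a registry of every valid chip key. A toy sketch, using HMAC as a stand-in for a real public-key signature scheme (all names here are illustrative):

```python
import hashlib
import hmac
import os

# Stand-in for Intel's root signing key. In a real scheme this would be
# an asymmetric keypair, with only the public half given to providers.
INTEL_ROOT_KEY = os.urandom(32)

def provision_chip():
    """Burn a unique per-chip 'keypair' and endorse it at manufacture time."""
    chip_secret = os.urandom(32)
    chip_public = hashlib.sha256(chip_secret).hexdigest()
    # Factory endorsement: Intel vouches that this public key belongs
    # to a genuine chip with the private half sealed in silicon.
    endorsement = hmac.new(INTEL_ROOT_KEY, chip_public.encode(),
                           hashlib.sha256).hexdigest()
    return chip_secret, chip_public, endorsement

def provider_trusts(chip_public, endorsement):
    """A content provider checks the factory endorsement instead of
    consulting a central registry of all valid public keys."""
    expected = hmac.new(INTEL_ROOT_KEY, chip_public.encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, endorsement)
```

Under this model the attack linked above still applies: a single extracted chip_secret (or worse, the root key) lets an attacker mint endpoints that look genuine to every provider.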
As one of the comments suggests, this seems doomed to failure. Paying for a streaming version of a movie doesn't seem very desirable, as it's inherently non-portable. There's a good chance I want to watch somewhere I don't have real[0] broadband. If it's too difficult to do that, I can always find a torrent. This situation does not seem likely to change.
[0] Mobile "broadband" doesn't count. In most places, it is entirely unsuitable for streaming high-def video and will be for several years.
There is plenty of market for streamed movies.
My TV provider offers films on demand at £2-4 per viewing. The convenience of being able to watch immediately and cheaply outweighs the benefit of buying the DVD/BR and being able to watch multiple times.
Quick cost comparison:
1) Stream it for £4. Film was rubbish. Total cost = £4.
2) Stream it for £4. Film was great. Wait a few months until price falls. Purchase DVD at discount for £5-£10. Total cost = £9-£14
3) After release purchase for £15. Total cost = £15.
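The comparison above can be reduced to a one-liner: stream first, and only buy the disc later if the film turns out to be a keeper. A quick back-of-envelope sketch (using £4 to stream, the £7.50 midpoint of the discounted disc range, and £15 at release - all taken from the numbers above):

```python
def expected_costs(p_keeper, stream=4.0, disc_later=7.5, disc_release=15.0):
    """Expected spend per film under the two strategies.

    p_keeper: probability the film is good enough to buy on disc.
    disc_later is the midpoint of the £5-£10 discounted range.
    """
    stream_first = stream + p_keeper * disc_later  # options 1 and 2 combined
    buy_at_release = disc_release                  # option 3
    return stream_first, buy_at_release

# Even if every film is a keeper (p_keeper=1.0), stream-first costs
# £11.50 versus £15 for buying outright at release.
```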
Of course this depends on the economics of your specific location and the cost differences, but personally I think I spend less in total on films because I very rarely buy physical media until the prices fall.
Streaming to a PC fits a very similar market profile; the only difference is that it caters to those who build their entertainment system around a PC rather than a regular TV, and I think the lines between those two separate systems are going to increasingly blur in the coming years.
>Paying for a streaming version of a movie doesn't seem very desirable
Right, that's why Netflix Instant is such a failure.
Um, if Netflix Instant charged users per movie, it would've been a failure. The reason people love Netflix Instant is: one low monthly fee; unlimited, unmetered use of the service.
This is night and day different from what Hollywood wants to do, and what Intel is trying to enable them to do: force consumers to buy physical discs, or force consumers to rent from a hardware-enforced DRM'd digital kiosk.
OK, good point.
That's because Netflix Instant isn't tied to a chip. It works on many devices.
Even with my ‘real’ (out-in-the-country 5.5mbit) broadband I’m still not yet convinced by streaming films. Nothing like a stutter to jerk you out of your immersion.
That's weird. The net connection at my dad's house is ridiculously bad. You'll be lucky if you can download at 60kb/sec, and this is what they call broadband (AT&T in Yorba Linda, CA). Yet Netflix movies stream without a hitch or a hiccup.
UK so no Netflix: I’m basing my previous comment on the TV catch-up services (iPlayer, ITV Player, 4OD). Maybe pay-for services commit to better QoS?
Amazon has had a video on demand service for quite some time now. It works on Windows and OS X and is supported by several devices like Roku.
Mental note: don't buy Intel chips with DRM
I'm curious how users on newegg and related sites will take this and how much it will hurt Intel's sales.
I gotta wonder about this and their relationship with Apple. Apple can't really use it because they also have a large ARM presence, and Intel isn't going to give it to ARM. Heck, Microsoft is in the same boat. Who exactly is going to be able to use this? Set-top boxes?
It's never a good idea to tie yourself to one DRM provider. Assuming that Apple wanted hardware DRM support (which they don't seem to care about), they could use Intel DRM on Macs and TrustZone on ARM.
It seems to me that this will actually become a desirable feature, if only because the masses are ignorant of DRM. I wonder if other chipmakers will be forced to "offer" DRM in order to compete.
And so the closed web begins! I am sure these new chips will have magical "flaws" in them that prevent regular content from being shown for unforeseen reasons, and they will also prevent copying DVDs and using certain sharing apps. It was only a matter of time. Prepare to have to pay every time you want to watch content instead of being able to buy individual copies of DVDs; prepare to get milked for your dollars every time you log on!
The DIVX revolution has been killed.
I am willing to bet a lot of money on the opposite side. My guess is that all existing x86/x86_64 code will continue to work just as well as it did before.
My guess is that CPU microcode designed to detect whether h264 decoding offloaded to the GPU came from a Usenet post instead of from Hollywood would be difficult, if not impossible, to implement. (My point: the CPU doesn't even see the video bits to decode anymore. So it probably can't fuck them up, even if it wanted to, which it doesn't.)
Would this make it politically OK for Netflix to release a Linux streaming client? I read somewhere that the lack of a standard DRM lib was the blocking issue on an Android client.
Isn't DRM on noninteractive content (not games) doomed to failure?
Yes.
If a human can view it, you can always, always, always copy it, reproduce it, and share it. No matter how much HDCP you throw at it, no matter what specialized hardware you devote to it...in the words of Mr. Universe..."you can't stop the signal."
(And all the King's horses, and all the King's men, couldn't put the industry's shitty paranoid ineffective wannabe DRM back together again.)
My favorite quote on the matter:
> For Alex, the impossibility of making digital information copy-proof is a central truth of our age: something to be explained, and then re-explained, to judges, reporters, and businesspeople, in amicus curiae briefs and interviews on NPR. For me, it follows from the fact that the set of n-bit strings constitutes an orthogonal basis for Hilbert space.
From Axel Boldt's idea diary,
1/24/2007 10:13 PM Defeat any copy protection on video/audio: play the content on a certified software player in a virtual machine, copy material from the virtual screen/loudspeaker.
Not that easy - any virtual machine that allows copying material from virtual output devices will break the chain of trust, and the certified player will refuse to play.
The problem with the "chain of trust" is that most of the links are held by Chinese companies who bid the lowest. All it takes is one employee with access to the key to leak it, and then everyone has it.
Note the ready availability of HDCP-strippers, for example.
You won't have much luck with that. I tried to view some online Flash video in a virtual machine (we all know how fucked Flash is on Linux), and it lost the audio/video sync after about 20 minutes.
It being in a virtual machine doesn't necessarily have anything to do with the problems; audio/video sync in lots of Flash content doesn't even work properly outside of virtual machines.
I watched the rest of the video on a normal windows machine, it didn't have any problems.
I'd put the odds over 80% that it's just because the VM wasn't fast enough. You then have two options. One is to get a faster machine, the other is to run the VM in slow motion while you save. Obviously the second isn't an option for realtime viewing.
Unless the game cannot effectively be played offline, I think that'd also be doomed.