New Amazon EC2 GPU Instance Type

phx.corporate-ir.net

47 points by isb 12 years ago · 43 comments

jeffbarr 12 years ago

There's more technical info in my post at http://aws.typepad.com/aws/2013/11/build-3d-streaming-applic...

ck2 12 years ago

Very weird that they don't use the Amazon domain for this, yet it looks exactly like Amazon.

Teaching consumers bad habits.

thenomad 12 years ago

How do the GPUs on this compare with NVidia desktop GPUs? Anyone know?

Also, very exciting that they're supporting GPU cloud rendering - that's going to be big for 3D.

HeXetic 12 years ago

> making it ideally suited for video creation services, 3D visualizations, streaming graphics-intensive applications ...

And, presumably, cracking hashes!

  • earlz 12 years ago

    I've never used Amazon EC2, but with this kind of application, I might have to give it a try. Buying a $300 graphics card just to try some GPU programming is ridiculous.

    • profquail 12 years ago

      You don't need to buy a $300 graphics card to experiment with GPU programming.

      The current and previous generation Intel CPUs (Haswell and Ivy Bridge, respectively) have on-die GPUs which support OpenCL: http://software.intel.com/en-us/articles/intel-sdk-for-openc...

      AMD's APUs are quite cheap (~$100) CPU+GPU designs similar to those in the upcoming PS4 and Xbox One (though the retail APUs are somewhat less powerful). They've been more-or-less designed specifically around the needs of a heterogeneous OpenCL application.

      Finally, the last several generations of Nvidia cards all support both CUDA and OpenCL, though the newer cards support additional features. You should be able to pick up a low-end, recent-generation Nvidia GPU for roughly $100 (see the device-enumeration sketch below).

      The new g2.2xlarge instances are $0.650/hour, and the existing cg1.4xlarge are $2.100/hour; so it may make sense to experiment on AWS a bit, then buy your own card for long-term use if you decide to spend more time doing GPU programming.
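
      As a minimal sketch of what getting started looks like, here is an OpenCL device survey in Python using the third-party pyopencl bindings; the package and a working vendor OpenCL driver are assumptions on my part, not anything from this thread:

        # Minimal OpenCL device survey; assumes the third-party pyopencl
        # package (pip install pyopencl) and a working vendor OpenCL driver.
        import pyopencl as cl

        for platform in cl.get_platforms():
            print(f"Platform: {platform.name} ({platform.vendor})")
            for device in platform.get_devices():
                print(f"  Device: {device.name}")
                print(f"    Type: {cl.device_type.to_string(device.type)}")
                print(f"    Compute units: {device.max_compute_units}")
                print(f"    Global memory: {device.global_mem_size // 2**20} MiB")

      On the rent-vs-buy arithmetic: at $0.650/hour, about 154 hours of g2.2xlarge time ($100 / $0.65) costs as much as an entry-level ~$100 card, which is one way to frame the decision.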

    • oakwhiz 12 years ago

      I take it that you don't play video games on your desktop machine, then.

  • hga 12 years ago

    As I understand it, scrypt is designed to mitigate this: https://en.wikipedia.org/wiki/Scrypt

    • seiji 12 years ago

      Except the interface to scrypt is kinda weird so nobody writes sane library bindings for it. (I tried once. I didn't get very far.)
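
      For contrast, Python's standard library did eventually grow a reasonably sane binding, hashlib.scrypt (Python 3.6+, built against OpenSSL 1.1+); a minimal sketch, with parameter values that are illustrative rather than recommendations:

        # Key derivation with scrypt via Python's standard-library binding.
        import hashlib
        import os

        password = b"correct horse battery staple"
        salt = os.urandom(16)  # fresh random salt per password

        # n (CPU/memory cost), r (block size), and p (parallelism) are the
        # knobs that make scrypt memory-hard; these values are illustrative.
        key = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)
        print(key.hex())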

  • prezjordan 12 years ago

    GPUs were rendered useless by ASIC miners, no?

    EDIT: Sorry for jumping on the Bitcoin hype train too soon! Many other uses for cracking hashes.

  • lowkeykiwi 12 years ago

    Let's go mining.

beamatronic 12 years ago

I'm trying Folding@Home on it now. Looks like it might not recognize the GPU.

    22:42:58:WU02:FS00:0x15:GPU memtest failure
    22:42:58:WU02:FS00:0x15:
    22:42:58:WU02:FS00:0x15:Folding@home Core Shutdown: GPU_MEMTEST_ERROR
    22:42:58:WU02:FS00:0x15:Starting GUI Server
    22:42:59:WARNING:WU02:FS00:FahCore returned: GPU_MEMTEST_ERROR (124 = 0x7c)

  • jeffbarr 12 years ago

    If you are confident that this should be working, post a note to the EC2 forum so that we can investigate.

lelf 12 years ago

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using_clu...

kayoone 12 years ago

With stuff like this, it looks like the devices we use could become mere streaming clients in the future, requiring not much processing power but excellent network connectivity.

That goes a bit against the trend in web development of moving much of the processing to the client side, so I wonder where this will go.

Really high-performance streaming of apps/games could reverse the trend of making everything browser-based, in favor of streamed native apps.

jewel 12 years ago

I work on some OpenGL software that renders slideshows, and this is precisely what we need. We've used the bigger cg1.4xlarge nodes in the past, but they are very expensive for what we're doing. The lower price on this (65¢/hr instead of $2.40) is going to be much more manageable for us.

dsugarman 12 years ago

This is huge beyond graphics: new levels of performance can be achieved with GPGPU for data-intensive startups. I would love to see someone build a company around this.

warrenmiller 12 years ago

Any idea whether this would make a decent bitcoin miner?

  • dmm 12 years ago

    The bitcoin network difficulty is rising so fast that even the first-gen ASICs are becoming obsolete.

    For example, if you have a good Radeon HD 7970, you can get about 0.8 GH/s. Based on the rate of difficulty increase, the 7970 would mine about 0.02 BTC in all of November 2013, 0.01 BTC in December 2013, and < 0.01 BTC/month after that.

    For various reasons, Nvidia cards are slower at BTC mining than AMD cards. The fastest Nvidia card, the Tesla S2070, can only hash about 0.750 GH/s.

    Even 60GH/s ASIC miners will be earning < 0.10 BTC per month by March 2014. In August 2013 a 60GH/s miner would make ~0.8 BTC PER DAY. That's how quickly the difficulty is increasing.

    At this point no GPU would make a decent bitcoin miner, except as a hobby.
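
    To make the arithmetic behind these estimates concrete, here is a small sketch of the standard expected-return formula; the difficulty figure is an illustrative late-2013 value I've assumed, not something from this thread:

      # Expected mining return: blocks/day = hashrate * 86400 / (difficulty * 2^32),
      # scaled by the 25 BTC block reward in effect in late 2013.
      def btc_per_day(hashrate_hs, difficulty, block_reward=25.0):
          blocks_per_day = hashrate_hs * 86400 / (difficulty * 2**32)
          return blocks_per_day * block_reward

      # A Radeon HD 7970 at ~0.8 GH/s against an assumed difficulty of ~7.1e8:
      print(btc_per_day(0.8e9, 7.1e8))  # ~0.00057 BTC/day, i.e. ~0.017 BTC/month

    That lines up with the ~0.02 BTC/month figure above, and since doubling the difficulty halves the return, it also shows why GPU returns collapse so quickly.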

  • varelse 12 years ago

    I'd guess it'd hit roughly 75% of a GTX 680 at this task.

dsugarman 12 years ago

Can they preload http://wiki.postgresql.org/wiki/PGStrom?
