WVC CIS Spring 2013
Computing Course
Curated by Ethan Bowe
Rescooped by Ethan Bowe from Digital Cinema Tools

After Effects and the state of GPU computing. By Phil Rhodes


Posted by Phil Rhodes on January 17, 2013

 

What exactly is GPU computing and what does and doesn't use it?

"As we saw back in December in Eyeon's informative video, GPU computing is a very powerful technique that, during the last year or two, has begun to break out of a niche. The original application of graphics cards was, obviously, video games. 3D rendering software such as Max and Lightwave have been using games-oriented graphics hardware to produce approximate previews of the scene for some time. More recently, the world's most popular operating system learned how to draw its user interface using more features on the graphics card, saving the CPU from spending its valuable time working out which window is on top.

 

What's new

What's new is the application of graphics processing units to calculations which are not, at least directly, graphics-related. Projects such as Folding@Home have used GPUs for simulation in medical research, and in the last couple of versions, some postproduction software has begun to apply the same technology to rendering effects. Even video games have followed the curve, and now commonly do physics simulation for rigid bodies, soft bodies, smoke, and liquids.

While this is all good, it could be better.


To understand why, it's probably worth recapping how modern GPUs work and what they're therefore capable of doing."

...

 

RedSharkNews.com


Via Thierry Saint-Paul
Bailey Spowart's curator insight, March 26, 2015 8:51 PM

Graphics cards’ use in digital design and professional video jobs and careers
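
To make the excerpt's idea of non-graphics GPU computing concrete, here is a minimal sketch that runs the same elementwise arithmetic on the CPU and, when available, on the GPU. This is not from Rhodes's article: the choice of NumPy and CuPy, the array size, and the timing harness are all illustrative assumptions.

```python
# A minimal sketch of "non-graphics" GPU computing: the same elementwise
# arithmetic run on the CPU with NumPy and, if a CUDA-capable GPU and the
# CuPy library are available, on the GPU as well.
import time
import numpy as np

n = 10_000_000
x_cpu = np.random.random(n).astype(np.float32)

t0 = time.perf_counter()
y_cpu = np.sin(x_cpu) * np.cos(x_cpu) + x_cpu ** 2      # runs on the CPU
cpu_s = time.perf_counter() - t0

try:
    import cupy as cp
    x_gpu = cp.asarray(x_cpu)                           # copy data into GPU memory
    t0 = time.perf_counter()
    y_gpu = cp.sin(x_gpu) * cp.cos(x_gpu) + x_gpu ** 2  # same math, on GPU cores
    cp.cuda.Device().synchronize()                      # wait for the GPU to finish
    gpu_s = time.perf_counter() - t0
    assert np.allclose(y_cpu, cp.asnumpy(y_gpu), atol=1e-5)
    print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
except ImportError:
    print(f"CPU: {cpu_s:.3f}s (CuPy not installed, GPU path skipped)")
```

The pattern matches the article's point: data is copied into graphics memory, thousands of GPU cores apply the same arithmetic to different elements in parallel, and the result is copied back.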

Rescooped by Ethan Bowe from SynBioFromLeukipposInstitute

Neuristors and the future of brain-like computing


Via Gerd Moe-Behrens
Gerd Moe-Behrens's curator insight, March 17, 2013 3:37 PM

by John Hewitt

"Hewitt Crane was a practical minded kind of guy. To help the world get a better feel for just how much oil it used in a year, he came up with the unit he called the cubic mile of oil (CMO) to considerable acclaim. Crane was actually one of the pioneers of computing. He was an early developer of magnetic core RAM, eye-tracking devices, pen input devices, and invented the first all-magnetic computers still finding extensive use for fail-safe systems in the military. Today, another kind of device he presciently envisioned back in 1960 is starting to attract attention – the neuristor.

A neuristor is the simplest possible device that can capture the essential property of a neuron – that is, the ability to generate a spike or impulse of activity when some threshold is exceeded. A neuristor can be thought of as a slightly leaky balloon that receives inputs in the form of puffs of air. When its limit is reached, it pops. The only major difference is that more complex neuristors can repeat the process again and again, as long as spikes occur no faster than a certain recharge period known as the refractory period." Read more: http://www.itproportal.com/2013/01/02/the-neuristor-and-the-future-of-computing/


http://bit.ly/141PbRJ
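
The balloon analogy above maps directly onto a leaky integrate-and-fire model. Here is a minimal sketch of that behaviour; the leak rate, threshold, and refractory constants are illustrative assumptions, not values from Hewitt's article.

```python
# A minimal leaky integrate-and-fire sketch of the neuristor described in
# the excerpt: input "puffs" accumulate in a leaky store, a spike fires when
# a threshold is crossed, and a refractory period blocks immediate re-firing.

LEAK = 0.95        # fraction of stored charge kept each time step (the leak)
THRESHOLD = 1.0    # charge level at which the device "pops" (spikes)
REFRACTORY = 3     # time steps after a spike during which input is ignored

def run_neuristor(inputs):
    """Return the spike train (0s and 1s) produced by a stream of input puffs."""
    charge, cooldown, spikes = 0.0, 0, []
    for puff in inputs:
        if cooldown > 0:                # still recovering from the last spike
            cooldown -= 1
            spikes.append(0)
            continue
        charge = charge * LEAK + puff   # leak a little, then add the new puff
        if charge >= THRESHOLD:         # threshold exceeded: the balloon pops
            spikes.append(1)
            charge = 0.0
            cooldown = REFRACTORY       # enforce the refractory period
        else:
            spikes.append(0)
    return spikes

# Weak, steady input accumulates past the threshold, spikes, then the cycle
# repeats; spikes can never arrive faster than the refractory period allows.
print(run_neuristor([0.3] * 20))
```

Note the two properties the excerpt calls out: a spike fires only when accumulated input exceeds the threshold, and the refractory period caps how quickly spikes can repeat.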