"Intel is re-architecting the fundamental building block of HPC systems by integrating the Intel Omni Scale Fabric into Knights Landing, marking a significant inflection and milestone for the HPC industry," said Charles ...
The U.S. Department of Energy's (DOE) Office of Science and National Nuclear Security Administration have awarded International Data Corporation (IDC) a three-year grant to conduct a full study of returns on investments (ROI) in high performance...
Computer scientists across the globe are working towards creating the first exascale supercomputers. However, energy consumption is likely to prove a major barrier to achieving this. Simon McIntosh-Smith of the University of Bristol, UK, reported at the workshop that the average power consumption of the top ten high-performance computing (HPC) systems in the Top500 list has increased five times over the last five years, while the average of the whole Top500 list has increased more than three times over the same period. “This isn’t very sustainable,” he says. “Nobody yet knows how to build a sustainable exaFLOPS supercomputer.”
“Getting the power isn’t necessarily the problem,” explains McIntosh-Smith. “The problem is really the cost: not just in terms of euros or dollars, but also in terms of CO2.” He adds: “A 10MW system costs about $10m per year to run in terms of power costs.”
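McIntosh-Smith's $10m figure is easy to verify with a back-of-the-envelope calculation. The sketch below assumes an electricity price of roughly $0.11/kWh (a typical industrial rate; the price is an assumption, not from the article), while the 10 MW system size comes from the quote above.

```python
# Back-of-the-envelope annual electricity cost for an HPC system.
# The $/kWh price is an assumed industrial rate; 10 MW is from the quote.
def annual_power_cost(power_mw, usd_per_kwh=0.11):
    hours_per_year = 24 * 365
    kwh_per_year = power_mw * 1000 * hours_per_year
    return kwh_per_year * usd_per_kwh

cost = annual_power_cost(10)
print(f"${cost / 1e6:.1f}M per year")  # roughly $9.6M, i.e. ~$10m as quoted
```

At these rates, every megawatt of sustained draw costs on the order of a million dollars a year, which is why energy efficiency dominates exascale planning.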
McIntosh-Smith is also a member of Europe’s Mont-Blanc project, which aims ‘to produce a new type of computer architecture capable of setting future global HPC standards that will provide exascale performance using 15 to 30 times less energy’. He and his project colleagues believe that energy-efficient mobile processors could be the key to reducing power consumption in HPC.
HP Extends Benefits of ARM Architecture Into the Datacenter With New Servers MarketWatch The HP ProLiant Moonshot servers deliver high-density, ARM-based systems for hyperscale, datacenter environments to help customers improve application...
Cisco Targets Hyperscale With Modular UCS Iron EnterpriseTech With the M-Series machines, Cisco is not just creating yet another microserver platform aimed at scale-out workloads, but is actually doing a bit of engineering.
The “Convergence of Clouds, Grids and their Management” conference track is devoted to discussing current and emerging trends in virtualization, cloud computing, high-performance computing, Grid computing and cognitive computing.
At the same time, company officials also discussed its upcoming 64-bit Cleantech I/O accelerator and server, which they said is designed for I/O-intensive workloads in scale-out data center environments.
NVIDIA today announced that multiple server vendors are leveraging the performance of NVIDIA GPU accelerators to launch the world's first 64-bit ARM development systems for high performance computing...
Three computer makers have decided to combine Nvidia graphics processors with ARM-based CPU cores for high performance computing — a first for the ARM architecture that has so far dominated the cell phone market.
IBM today announced that it is making high performance computing (HPC), as part of technical computing, more accessible through the cloud for clients grappling with big data and other computationally intensive activities.
Researchers at Eindhoven University of Technology (TU/e) in the Netherlands and the University of Central Florida (CREOL), report in the journal Nature Photonics the successful transmission of a record high 255 Terabits/s over a new type of fiber allowing 21 times more bandwidth than currently available in communication networks. This new type of fiber could be an answer to mitigating the impending optical transmission capacity crunch caused by the increasing bandwidth demand.
Due to the popularity of Internet services and the emergence of capacity-hungry datacenters, demand for telecommunication bandwidth is expected to keep growing at an exponential rate. One option for transmitting more information through current optical glass fibers is to increase the power of the signals to overcome the losses inherent in the glass from which the fiber is manufactured. However, this produces unwanted photonic nonlinear effects, which limit the amount of information that can be recovered after transmission over standard fiber.
The team at TU/e and CREOL, led by Dr. Chigo Okonkwo, an assistant professor in the Electro-Optical Communications (ECO) research group at TU/e, and Dr. Rodrigo Amezcua Correa, a research assistant professor in micro-structured fibers at CREOL, demonstrates the potential of a new class of fiber to increase transmission capacity and mitigate the impending 'capacity crunch' in their article, which appeared yesterday in the online edition of the journal Nature Photonics.
The new fiber has seven different cores through which the light can travel, instead of the single core in current state-of-the-art fibers. This is comparable to going from a one-way road to a seven-lane highway. In addition, they introduce two extra orthogonal dimensions for data transport, as if three cars could drive on top of each other in the same lane. Combining these two methods, they achieve a gross transmission throughput of 255 Terabits/s over the fiber link, more than 20 times the current standard of 4-8 Terabits/s.
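The "21 times more bandwidth" figure quoted earlier follows directly from the fiber geometry described above: seven cores, each carrying a base spatial channel plus the two additional orthogonal dimensions. A minimal sketch of that arithmetic:

```python
# Capacity multiplier of the multicore fiber versus a standard
# single-mode, single-core fiber. Both figures (7 cores, 2 extra
# orthogonal dimensions per core) come from the article text.
cores = 7
channels_per_core = 1 + 2  # base channel plus two extra orthogonal dimensions

multiplier = cores * channels_per_core
print(multiplier)  # 21 parallel spatial channels, matching the claimed 21x
```

Spatial-division multiplexing thus scales capacity multiplicatively: total throughput grows as cores times channels per core, without raising per-channel launch power into the nonlinear regime.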
It's also likely that enterprise data centers will at least shrink in size: 58 percent expect that data centers will be half the size of current facilities or smaller, while 10 percent of participants believe the enterprise data center of 2025 will...
Duration: 18:47. On today's edition of Soundbite, we'll be talking with Gary Tyreman, CEO of Univa, about the status of Grid Engine and how it's made the transition from a successful open source ...