A new video standard enables a fourfold increase in the resolution of TV screens, and an MIT chip was the first to handle it in real time.
It took only a few years for high-definition televisions to make the transition from high-priced novelty to ubiquitous commodity — and they now seem to be heading for obsolescence just as quickly. At the Consumer Electronics Show (CES) in January, several manufacturers debuted new ultrahigh-definition, or UHD, models (also known as 4K or Quad HD) with four times the resolution of today’s HD TVs.
Displaying UHD content requires not just screens with four times the pixels, however, but also a new video-coding standard, known as high-efficiency video coding, or HEVC. Also at CES, Broadcom announced the first commercial HEVC chip, which it said will go into volume production in mid-2014.
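The "four times the resolution" claim follows from simple arithmetic on frame sizes. A minimal sketch (the frame dimensions are the standard 1080p HD and 2160p UHD sizes; nothing here comes from the MIT chip itself):

```python
# Compare per-frame pixel counts for standard HD (1080p) and UHD (2160p).
hd = (1920, 1080)    # full-HD frame, width x height
uhd = (3840, 2160)   # ultrahigh-definition ("4K" / Quad HD) frame

hd_pixels = hd[0] * hd[1]     # pixels per HD frame
uhd_pixels = uhd[0] * uhd[1]  # pixels per UHD frame

# UHD doubles both dimensions, so it carries 4x the pixels per frame.
print(hd_pixels, uhd_pixels, uhd_pixels // hd_pixels)
```

Quadrupling the raw pixel data per frame is what makes a more efficient codec like HEVC necessary: without better compression, UHD streams would need roughly four times the bandwidth of HD.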
At the International Solid-State Circuits Conference this week, MIT researchers unveiled their own HEVC chip. The researchers’ design was executed by the Taiwan Semiconductor Manufacturing Company, through its University Shuttle Program, and Texas Instruments (TI) funded the chip's development.
Although the MIT chip isn’t intended for commercial release, its developers believe that the challenge of implementing HEVC algorithms in silicon helps illustrate design principles that could be broadly useful. Moreover, “because now we have the chip with us, it is now possible for us to figure out ways in which different types of video data actually interact with hardware,” says Mehul Tikekar, an MIT graduate student in electrical engineering and computer science and one of the paper's co-authors.