pixels and pictures
 
Scooped by Philippe J DEWOST

Smallest 4K camera captures 21 frames per second


A new tiny camera captures better-than-HD video. The latest device from Point Grey comes "in an ice-cube sized, low-cost package," says the British Columbia-based industrial camera maker.
The FL3-U3-88S2C captures 4,096 x 2,160 video with an 8.8 megapixel Sony IMX121 Exmor R sensor. "The impressive 4K2K resolution combined with the ease of USB 3.0 and the camera's small size makes the new Flea3 suitable for a variety of high resolution color applications including automatic optical inspection, ophthalmology, interactive multimedia, and broadcast," the company says.
The Flea3 camera measures 29 x 29 x 30 mm, uses USB 3.0 connectivity, and is priced at $945.

Exploring the digital imaging chain from sensors to brains

Scooped by Philippe J DEWOST

Moon over the Crest


Canon EOS60D + 70-200 f/4 L - Serre Chevalier Valley, French Alps

Philippe J DEWOST's insight:

Proud of this one

Scooped by Philippe J DEWOST

The Château de Chambord encircled by water

The storms that have battered France for several days have all but turned the Château de Chambord (Loir-et-Cher) into an island. Half of the access roads are closed, the fire-safety system is out of order, and all the car parks are inaccessible or under water. As a precaution, the monument is closed to the public this Wednesday. François I's dream of a castle rising from the waters, the ones he reportedly wanted to divert from the Loire, has come true.
Philippe J DEWOST's insight:
François I dreamed of it; Météo France made it happen!
Scooped by Philippe J DEWOST

Moonset over Mars (Mount Sharp between sol 610 and 613)


Moonset over Mount Sharp

This image combines a single Mastcam frame taken of Phobos behind Mt. Sharp on sol 613 (April 28, 2014) with three images from a 360-degree mosaic acquired during the afternoon of sol 610 (April 24, 2014) to extend the foreground view and balance the image composition.
The moonset view came from one sol; Justin extended the mosaic with images taken on a previous sol. "The sol 610 frames were adjusted to match the color of the Sol 613 image. As these additional frames were in the opposite direction of the Sun, very few shadows are present, ideal for matching the post-sunset lighting conditions of the sol 613 image," he writes.

A bit of Phobos trivia: Curiosity's view here is to the east, but this is indeed a moonset, not a moonrise. Phobos orbits so close to Mars that it moves around Mars faster than Mars rotates, and consequently it appears to rise in the west and set in the east!

Philippe J DEWOST's insight:

Would such a moonset make you wish to die on Mars, just not on impact?

Scooped by Philippe J DEWOST

Introducing Facebook Surround 360: An open, high-quality 3D-360 video capture system

  • Facebook has designed and built a durable, high-quality 3D-360 video capture system.
  • The system includes a design for camera hardware and the accompanying stitching code, and we will make both available on GitHub this summer. We're open-sourcing the camera and the software to accelerate the growth of the 3D-360 ecosystem — developers can leverage the designs and code, and content creators can use the camera in their productions.
  • Building on top of an optical flow algorithm is a mathematically rigorous approach that produces superior results. Our code uses optical flow to compute left-right eye stereo disparity. We leverage this ability to generate seamless stereoscopic 360 panoramas, with little to no hand intervention. 
  • The stitching code drastically reduces post-production time. What is usually done by hand can now be done by algorithm, taking the stitching time from weeks to overnight. 
  • The system exports 4K, 6K, and 8K video for each eye. The 8K videos double industry standard output and can be played on Gear VR with Facebook's custom Dynamic Streaming technology.

Today we announced Facebook Surround 360 — a high-quality, production-ready 3D-360 hardware and software video capture system.

 

In designing this camera, we wanted to create a professional-grade end-to-end system that would capture, edit, and render high-quality 3D-360 video. In doing so, we hoped to meaningfully contribute to the 3D-360 camera landscape by creating a system that would enable more VR content producers and artists to start producing 3D-360 video. 

Defining the challenges of VR capture

When we started this project, all the existing 3D-360 video cameras we saw were either proprietary (so the community could not access those designs), available only by special request, or fundamentally unreliable as an end-to-end system in a production environment. In most cases, the cameras in these systems would overheat, the rigs weren't sturdy enough to mount to production gear, and the stitching would take a prohibitively long time because it had to be done by hand. So we set out to design and build a 3D-360 video camera that did what you'd expect an everyday camera to do — capture, edit, and render reliably every time. That sounds obvious and almost silly, but it turned out to be a technically daunting challenge for 3D-360 video.

Many of the technical challenges for 3D video stem from shooting the footage in stereoscopic 360. Monoscopic 360, using two or more cameras to capture the whole 360 scene, is pretty mainstream. The resultant images allow you to look around the whole scene but are rather flat, much like a still photo. 

However, things get much more complicated when you want to capture 3D-360 video. Unlike monoscopic video, 3D video requires depth. We get depth by capturing each location in a scene with two cameras — the camera equivalent of your left eye and right eye. That means you have to shoot in stereoscopic 360, with 10 to 20 cameras collectively pointing in every direction. Furthermore, all the cameras must capture 30 or 60 frames per second, exactly and simultaneously. In other words, they must be globally synchronized. Finally, you need to fuse or stitch all the images from each camera into one seamless video, and you have to do it twice: once from the virtual position for the left eye, and once for the right eye.

This last step is perhaps the hardest to achieve, and it requires fairly sophisticated computational photography and computer vision techniques. The good news is that both of these have been active areas of research for more than 20 years. The combination of past algorithm research, the rapid improvement and availability of image sensors, and the decreasing cost of memory components like SSDs makes this project possible today. It would have been nearly impossible as recently as five years ago.
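The left/right disparity idea in the last two paragraphs can be sketched with a toy example. The block-matching code below is an illustrative stand-in, not Facebook's pipeline (which uses a more sophisticated optical flow algorithm): for each position in one eye's image row, it searches the other eye's row for the best-matching patch, and the winning horizontal shift is the stereo disparity.

```python
import numpy as np

def block_match_disparity(left, right, patch=5, max_disp=8):
    """Estimate per-pixel horizontal shift between two 1-D image rows by
    brute-force block matching -- a toy stand-in for the optical-flow step
    that computes left/right eye stereo disparity."""
    h = patch // 2
    disp = np.zeros(left.shape[0], dtype=int)
    for x in range(h, left.shape[0] - h - max_disp):
        ref = left[x - h:x + h + 1]
        # Try every candidate shift and keep the one with the lowest error.
        errors = [np.abs(ref - right[x + d - h:x + d + h + 1]).sum()
                  for d in range(max_disp + 1)]
        disp[x] = int(np.argmin(errors))
    return disp

# Synthetic check: the "right eye" sees the same row shifted by 3 pixels.
rng = np.random.default_rng(0)
row = rng.random(64)
print(block_match_disparity(row, np.roll(row, 3))[10:20])  # all 3s here
```

Real stitching does this densely in 2-D, in both directions, and then blends the matched pixels into seamless per-eye panoramas; the search-for-best-match structure is the same.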

The VR capture system

With these challenges in mind, we began experimenting with various prototypes and settled on the three major components we felt were needed to make a reliable, high-quality, end-to-end capture system:

  • The hardware (the camera and control computer)
  • The camera control software (for synchronized capture)
  • The stitching and rendering software 

All three are interconnected and require careful design and control to achieve our goals of reliability and quality. Weakness in one area would compromise quality or reliability in another area.

Additionally, we wanted the hardware to be off-the-shelf. We wanted others to be able to replicate or modify our design based on our design specs and software without having to rely on us to build it for them. We wanted to empower technical and creative teams outside of Facebook by allowing them full access to develop on top of this technology.

The camera hardware

As with any system, we started by laying out the basic hardware requirements. Relaxing any one of these would compromise quality or reliability, and sometimes both.

Camera requirements: 

  • The cameras must be globally synchronized. All the frames must capture the scene at the same time within less than 1 ms of one another. If the frames are not synchronized, it can become quite hard to stitch them together into a single coherent image.
  • Each camera must have a global shutter. All the pixels must see the scene at the same time. That's something, for example, cell phone cameras don't do; they have a rolling shutter. Without a global shutter, fast-moving objects will smear diagonally across the image, from top to bottom.
  • The cameras themselves can’t overheat, and they need to be able to run reliably over many hours of on-and-off shooting.
  • The rig and cameras must be rigid and rugged. Processing later becomes much easier and higher quality if the cameras stay in one position.
  • The rig should be relatively simple to construct from off-the-shelf parts so that others can replicate, repair, and replace parts.

We addressed each of these requirements in our design. Industrial-strength cameras by Point Grey have global shutters and do not overheat when they run for a long time. The cameras are bolted onto an aluminum chassis, which ensures that the rig and cameras won't bounce around. The outer shell is made with powder-coated steel to protect the internal components from damage. (Lest anyone think an aluminum chassis or steel frame is hard to come by, any machining shop will do the honors once handed the specs.)

Philippe J DEWOST's insight:

Amazing design and open-source market approach by Facebook, reminding us that, even in (360 VR) imaging, if software is "eating the world," hardware is still shaping it. #HardwareIsNotDead

Scooped by Philippe J DEWOST

Cambridge-based image fusion pioneer Spectral Edge announces successful £1.5m funding round

Cambridge-based image fusion pioneer attracts major backing to commercialise product portfolio

Spectral Edge, (http://www.spectraledge.co.uk/) today announced the successful completion of an oversubscribed £1.5 million second funding round. New lead investors IQ Capital and Parkwalk Advisors, along with angel investors from Cambridge Angels, Wren Capital, Cambridge Capital Group and Martlet, the Marshall of Cambridge Corporate Angel investment fund, join the Rainbow Seed Fund/Midven and Iceni in backing the company.

Spun out of the University of East Anglia (UEA) Colour Lab, Spectral Edge has developed innovative image fusion technology. This combines different types of image, ranging from the visible to invisible (such as infrared and thermal), to enhance detail, aid visual accessibility, and create ever more beautiful pictures. 

Spectral Edge’s Phusion technology platform has already been proven in the visual accessibility market, where independent studies have shown that it can transform the TV viewing experience for the estimated 4% of the world’s population that suffers from colour-blindness. It enhances live TV and video, allowing colour-blind viewers to differentiate between colour combinations such as red-green and pink-grey so that otherwise inaccessible content such as sport can be enjoyed. 

The new funding will be used to expand Spectral Edge’s team, increase investment in sales and marketing, and underpin development of its product portfolio into IP-licensable products and reference designs. Spectral Edge is mainly targeting computational photography, where blending near-infrared and visible images gives higher quality, more beautiful results with greater depth. Other applications include security, where the combination of visible and thermal imaging enhances details to provide easier identification of people filmed on surveillance cameras, as well as visual accessibility through its Eyeteq brand.

"Spectral Edge is a true pioneer in the field of photography. They are set to disrupt and transform the imaging sector, not just within consumer and professional photography, but also across a broad range of business sectors,” said Max Bautin, Managing Partner at IQ Capital. "Backed by a robust catalogue of IP, Spectral Edge’s technology enables individuals and companies to take pictures and record videos with unparalleled detail by taking advantage of non-visible information like near-infra red and heat. We are proud to add Spectral Edge to our portfolio of companies. We back cutting-edge IP-rich technology which pushes the boundaries but also has a proven track record of experiencing stable growth, and Spectral Edge fits that mould perfectly."

“We are delighted to support Professor Graham Finlayson and his team at Spectral Edge,” said Alastair Kilgour, CIO of Parkwalk Advisors. “We believe Phusion could prove to be a substantial enhancement to the quality of digital imaging and as such have significant commercial prospects.”

Spectral Edge is led by an experienced team that combines deep technical and business experience. It includes Professor Graham Finlayson, Head of Vision Group and Professor of Computing Science, UEA, Christopher Cytera (managing director) and serial entrepreneur Dr Robert Swann (chairman).

Philippe J DEWOST's insight:

Looks like the imsense founder and IQ Capital are doing it again #BeenThereDoneThat. Congratulations, Graham!

Scooped by Philippe J DEWOST

The Mystery of Prince Rupert's Drop at 130,000 FPS

Destin of ‘Smarter Every Day’ explores the mystery of Prince Rupert’s drop using an ultra-high-speed camera
Philippe J DEWOST's insight:

130,000 frames per second lets you capture phenomena moving at 1,500 meters per second and understand the dynamics of glass breaking.
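The arithmetic behind that insight, taking both figures as round approximations:

```python
fps = 130_000        # high-speed camera frame rate from the video
crack_speed = 1_500  # m/s -- approximate crack propagation speed in glass

# Distance the crack front travels between two consecutive frames.
mm_per_frame = crack_speed / fps * 1000
print(f"{mm_per_frame:.1f} mm of crack travel between frames")  # 11.5 mm
```

At roughly a centimeter of crack travel per frame, the camera resolves the fracture as it races down the drop's tail.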

Scooped by Philippe J DEWOST

Techart Unveils the World's First Autofocus Adapter for Manual Focus Lenses

The Guangzhou, China-based company Techart has officially unveiled the Techart PRO AF adapter, the world’s first autofocus adapter for manual focus lenses. The adapter, which was teased last month, actually lets you autofocus with lenses that don’t have that ability.
Philippe J DEWOST's insight:
Interesting innovation coming from China for those who have kept their precious "gems"
Rescooped by Philippe J DEWOST from cross pond high tech

Magic Leap Raises $794 Million And Announces "Mixed Reality Lightfield"


Magic Leap raised $794 million in new funding and CEO Rony Abovitz posted a blog suggesting the secretive company is moving closer toward a product, writing “we are setting up supply chain operations, manufacturing.”

Chinese e-commerce company Alibaba led the round and Joe Tsai, Alibaba’s Executive Vice Chairman, is getting a seat on the board. The announcement roughly confirms a December report suggesting the company was raising money in this ballpark.

The Series C round puts the Florida startup’s funding to date close to $1.4 billion.

 

Magic Leap also seems to have named its technology “Mixed Reality Lightfield” with subtle language in the blog post linked above that might be commentary about current VR technology, which isn’t able to perfectly reproduce what your eyes see in the real world.

“It comes to life by following the rules of the eye and the brain, by being gentle, and by working with us, not against us,” Abovitz wrote about the company’s technology. “By following as closely as possible the rules of nature and biology.”

Abovitz previously suggested Rift-like VR headsets have a history of “issues that near-eye stereoscopic 3d may cause” and that “we have done an internal hazard and risk analysis….on the spectrum of hazards that may occur to a wide array of users.”

Philippe J DEWOST's curator insight, February 3, 5:06 AM

The staggering amount raised by Magic Leap is all but virtual and makes the Oculus Rift acquisition price look almost "reasonable". #SelfReminder: need to update my "Brief History of Interfaces" slide deck

Scooped by Philippe J DEWOST

Photographer Captures Powerful Waves on Lake Erie as Liquid Mountains

Dave Sandford is a professional sports photographer of 18 years whose hometown is London, Ontario, Canada. Over the past 4 weeks, for 2 to 3 days per week, Sandford has been driving 45 minutes to Lake Erie, spending up to 6 hours a day photographing the lake. The photos are awe-inspiring: Sandford gets in the water and shoots the powerful choppy waves in a way that makes them look like epic mountain peaks that are exploding into the atmosphere.
Philippe J DEWOST's insight:
Beautiful, powerful, and sometimes frightening shots taken on a lake where waves sometimes look like mountains. "Se méfier de l'eau qui dort" ("beware of still waters"), as we French say...
Scooped by Philippe J DEWOST

Affordable camera reveals hidden details invisible to the naked eye | UW Today

An affordable camera technology being developed by the University of Washington and Microsoft Research might enable consumers of the future to tell which piece of fruit is perfectly ripe or what’s rotting in the fridge. The team of computer science and electrical engineers developed HyperCam, a lower-cost hyperspectral camera that uses both visible and invisible near-infrared light to “see” beneath surfaces and capture unseen details. This type of camera is typically used in industrial applications and can cost between several thousand and tens of thousands of dollars. In a paper presented at the UbiComp 2015 conference, the team detailed a hardware solution that costs roughly $800, or potentially as little as $50 to add to a mobile phone camera. They also developed intelligent software that easily finds “hidden” differences between what the hyperspectral camera captures and what can be seen with the naked eye.
Philippe J DEWOST's insight:
Hypercam illustrates how imaging will go beyond the visible and turn smartphones into scanners.
Scooped by Philippe J DEWOST

Newly Launched EyeNetra Mobile Eye-Test Device Could Lead To Prescription Virtual-Reality Screens


After five years of development and about 40,000 tests worldwide, the smartphone-powered eye-test devices developed by MIT spinout EyeNetra are coming to hospitals, optometric clinics, optical stores, and even homes nationwide.

But on the heels of its commercial release, EyeNetra says it’s been pursuing opportunities to collaborate with virtual-reality companies seeking to use the technology to develop “vision-corrected” virtual-reality displays.

“As much as we want to solve the prescription glasses market, we could also [help] bring virtual reality to the masses,” says EyeNetra co-founder Ramesh Raskar, an associate professor of media arts and sciences at the MIT Media Lab who co-invented the device.

The device, called Netra, is a plastic, binocular-like headset. Users attach a smartphone, with the startup’s app, to the front and peer through the headset at the phone’s display. Patterns, such as separate red and green lines or circles, appear on the screen. The user turns a dial to align the patterns and pushes a button to lock them in place. After eight interactions, the app calculates the difference between what the user sees as “aligned” and the actual alignment of the patterns. This signals any refractive errors, such as nearsightedness, farsightedness, and astigmatism. The app then displays the refractive powers, axis of astigmatism, and pupillary distance required for eyeglasses prescriptions.

In April, the startup launched Blink, an on-demand refractive test service in New York, where employees bring the startup's optometry tools, including the Netra device, to people’s homes and offices. In India, EyeNetra has launched Nayantara, a similar program to provide low-cost eye tests to the poor and uninsured in remote villages, far from eye doctors. Both efforts used EyeNetra’s suite of tools, now available for eye-care providers worldwide.

According to the World Health Organization, uncorrected refractive errors are the world’s second-highest cause of blindness. EyeNetra originally invented the device for the developing world — specifically, for poor and remote regions of Africa and Asia, where many people can’t find health care easily. India alone has around 300 million people in need of eyeglasses.

Philippe J DEWOST's insight:

Interesting crossroads between VR and healthcare, and a sound reminder of how incredibly powerful smartphones have become !

Scooped by Philippe J DEWOST

New Horizons Finds Blue Skies and Water Ice on Pluto


The first color images of Pluto’s atmospheric hazes, returned by NASA’s New Horizons spacecraft last week, reveal that the hazes are blue.

“Who would have expected a blue sky in the Kuiper Belt? It’s gorgeous,” said Alan Stern, New Horizons principal investigator from Southwest Research Institute (SwRI), Boulder, Colorado.

The haze particles themselves are likely gray or red, but the way they scatter blue light has gotten the attention of the New Horizons science team. “That striking blue tint tells us about the size and composition of the haze particles,” said science team researcher Carly Howett, also of SwRI. “A blue sky often results from scattering of sunlight by very small particles. On Earth, those particles are very tiny nitrogen molecules. On Pluto they appear to be larger — but still relatively small — soot-like particles we call tholins.”

Scientists believe the tholin particles form high in the atmosphere, where ultraviolet sunlight breaks apart and ionizes nitrogen and methane molecules and allows them to react with one another to form more and more complex negatively and positively charged ions. When they recombine, they form very complex macromolecules, a process first found to occur in the upper atmosphere of Saturn’s moon Titan. The more complex molecules continue to combine and grow until they become small particles; volatile gases condense and coat their surfaces with ice frost before they have time to fall through the atmosphere to the surface, where they add to Pluto’s red coloring.

Philippe J DEWOST's insight:

Fascinated by NASA's ability to convey and share their findings through these stunning images.

Scooped by Philippe J DEWOST

Eye-Popping Veil Nebula Colored Image released by NASA


Eight thousand years ago, around the time humans were just getting good at farming, a star 20 times as big as our sun blew up about 2,100 light years from here. This feathery ribbon is the aftermath.

NASA just released this eye-popping image of the Veil Nebula taken by the Hubble Space Telescope. The nebula, which looks a little like the Nexus from a certain forgettable Star Trek movie, spans 110 light years. It's so huge that this image is merely a mosaic of Hubble pictures that together cover about two light years. As is typically the case with Hubble's unforgettable nebula pictures, the colors you see actually represent the stuff it's made of. Here, red is hydrogen, green is sulfur, and blue is oxygen.

Philippe J DEWOST's insight:

A star went supernova 8,000 years ago. Hydrogen, oxygen, and sulfur form the red, blue, and green stuff. NASA has created a flyover visualization of this stunning structure that reminds us that we are all made of stars...

Scooped by Philippe J DEWOST

The Human Eye is So Sensitive It Can Detect a Single Photon


You may think you’re no good at seeing in the dark, but your eyes are actually incredibly sensitive. In fact, according to a new study, the human eye is so sensitive it can detect even a single photon of light!

 

The study was conducted by a team at Rockefeller University who used an innovative (and complicated) technique to reliably fire a single high energy photon directly at participants’ retinas. For their part, the participants just had to tell them when they saw something and rate how confident they were about the sighting.

The results were surprising to say the least. We’re talking about the smallest particle of light, and the results from the study show that people were able to accurately determine when a photon was fired 51.6% of the time (60% when they were very confident)—a statistically significant percentage that couldn’t possibly result from subjects guessing their way through it. What’s more, subjects were more likely to detect a second photon if it was fired less than 10 seconds after the first.
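A quick sanity check on why a 1.6-point edge over chance can be statistically significant. The sketch below uses the standard normal approximation to the binomial; the trial counts are illustrative round numbers, not the study's exact figures:

```python
from math import erf, sqrt

def guessing_p_value(frac_correct, n_trials):
    """One-sided p-value (normal approximation to the binomial) for scoring
    frac_correct over n_trials when pure 50/50 guessing is the null."""
    z = (frac_correct - 0.5) * sqrt(n_trials) / 0.5
    return 0.5 * (1 - erf(z / sqrt(2)))

# 51.6% correct over a hundred trials is indistinguishable from guessing...
print(guessing_p_value(0.516, 100) > 0.05)     # True: too few trials to tell
# ...but wildly unlikely by chance over tens of thousands of trials.
print(guessing_p_value(0.516, 30_000) < 1e-7)  # True: the edge is real
```

This is why the experiment needed so many repetitions: the single-photon signal is a tiny tilt in the hit rate that only a large trial count can separate from noise.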

Philippe J DEWOST's insight:

About “the absolute limits of human vision” paper released on nature.com. At imsense Ltd. we called this eye-fidelity™...

Scooped by Philippe J DEWOST

This is What the History of Camera Sales Looks Like with Smartphones Included | PetaPixel

A few months ago, we shared a chart showing how sales in the camera market changed between 1947 and 2014. The data shows that after a large spike in the late 2000s, sales of dedicated cameras have been shrinking by double-digit figures in each of the following years. Mix in data for smartphone sales, and the chart can shed some more light on the state of the industry.
Philippe J DEWOST's insight:
This chart is eye-opening even if not so surprising when you realize that the billionth iPhone is expected before the end of 2016. Feeling blessed and glad to have been a participant (twice: with Realeyes3D and then imsense) in such a massive rise.
Scooped by Philippe J DEWOST

The Untold Story of Magic Leap, the World’s Most Secretive Startup


Among the first people Abovitz hired at Magic Leap was Neal Stephenson, author of the other seminal VR anticipation, Snow Crash. He wanted Stephenson to be Magic Leap’s chief futurist because “he has an engineer’s mind fused with that of a great writer.” Abovitz wanted him to lead a small team developing new forms of narrative. Again, the mythmaker would be making the myths real.

Philippe J DEWOST's insight:

An amazing, deep, and thorough analysis of the Magic Leap phenomenon and the subtleties between Augmented, Mixed, and Virtual realities, and a tribute to Neal Stephenson and his Snow Crash. Welcome to the MetaVerse (and you ain't seen nothin' yet).

Scooped by Philippe J DEWOST

Ostagram


These user-created images are the product of an art technique known as Inceptionism, using neural networks to generate a single mind-bending picture from two source images.

 

The images are possible thanks to DeepDream software, which finds and enhances patterns in images by a process known as algorithmic pareidolia. It was pioneered by Google and was originally code-named Inception after the film of the same name. 

Philippe J DEWOST's insight:

I didn't know anything about Inceptionism and Pareidolia until I bumped into these incredible images on Ostagram...

Scooped by Philippe J DEWOST

Single Picture Explains How Aperture, Shutter Speed, and ISO Work In Photography

If you’re a beginner photographer, it can be helpful to have a simple guide that helps you understand the different settings that you can toggle on your DSLR camera. While this helpful exposure chart by Daniel Peters at Fotoblog Hamburg won’t explain HOW the optics of photography work, it will show you exactly what happens when you tweak your camera’s settings.
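The reciprocity such a chart illustrates follows from the standard APEX exposure-value formula, EV = log2(N²/t) at ISO 100: widening the aperture by one stop while halving the shutter time leaves the exposure unchanged. The specific settings below are illustrative examples, not taken from Peters' chart:

```python
from math import log2

def exposure_value(f_number, shutter_s):
    """APEX exposure value at ISO 100: EV = log2(N^2 / t).
    +1 EV means half the light; equivalent exposures share the same EV."""
    return log2(f_number ** 2 / shutter_s)

# One stop wider aperture traded for one stop faster shutter -> same EV.
# (f/5.6 on the lens barrel is the nominal marking for sqrt(32) ~ 5.657.)
print(round(exposure_value(8, 1 / 125), 1))      # 13.0 -- f/8   at 1/125 s
print(round(exposure_value(5.657, 1 / 250), 1))  # 13.0 -- f/5.6 at 1/250 s
```

ISO completes the triangle the same way: doubling ISO buys back one stop, at the cost of more noise.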
Philippe J DEWOST's insight:
Smart !
Scooped by Philippe J DEWOST

This Glass Disc Can Store 360 TB of Your Photos for 13.8 Billion Years

Scientists have created nanostructured glass discs that can store digital data for billions of years. Researchers at the University of Southampton announced this week that they’ve figured out how to store huge amounts of data on small glass discs using laser writing. They call it five-dimensional (5D) digital data because in addition to the position of the data, its size and orientation play a role too. The glass storage discs can hold a whopping 360 terabytes each, are stable at temperatures up to 1,000°C (1,832°F), and are expected to keep the data intact for 13.8 billion years at room temperature (anything up to 190°C, or 374°F).
Philippe J DEWOST's insight:
With disc rot and data rot apparently solved, what about access time?
Scooped by Philippe J DEWOST

What Is Your Score at Deepart Visual Turing Test ?

You will be shown 10 pairs of pictures. In each pair, one is painted by a human and another one is generated by artificial intelligence based on a photo and a style of a painter. Click on a picture painted by a human.
Philippe J DEWOST's insight:

An interesting, quick, and fun image-recognition Turing test by deepart.io.

Wondering if Facebook and Google AI algorithms tried to pass the test and if so, how much they scored...

(And by the way I scored 8/10 so I still feel quite human)

Scooped by Philippe J DEWOST

Happy 15th Birthday to VLC/VideoLan !


Technically, today is the 15th anniversary of the relicensing of all the VideoLAN software to the GPL license, as agreed by the École Centrale Paris, on February 1st, 2001.

If you've been to one of my talks (if you haven't, you should come to one), you know that the project that became VideoLAN and VLC is almost 5 years older than that, and was called Network 2000.

Moreover, the first commit on the VideoLAN Client project, from August 8th, 1999, by Michel Kaempf, already had 21,275 lines of code, so the VLC software was started earlier in 1999.

However, the most important date for the birth of VLC is when it was allowed to be used outside of the school, and therefore when the project was GPL-ized: February 1st, 2001.

Facts and numbers

Since then, on VLC alone, we've had around:

  • 700 contributors,
  • 70,000 commits,
  • at least 2 billion downloads,
  • hundreds of millions of users!


And all that, mostly with volunteers and without turning into a business!


We now have ports for Windows, GNU/Linux, BSD, OS X, iPhone and iPad, Android, Solaris, Windows Phone, BeOS, OS/2, Android TV, Apple TV, Tizen, and ChromeOS.

Philippe J DEWOST's insight:

Amazing achievement by Jean-Baptiste Kempf & the @videolan team!


A mountaineering photographer captures incredible shots in extreme conditions

Robert Bösch is a Swiss mountain guide, but also a famous photographer currently working with the Mammut brand. The company, which sells mountain-sports equipment, wanted to pay tribute to one of the pioneers of Alpinism: Edward Whymper. Robert Bösch and a team of climbers staged the first ascent of the Matterhorn, Switzerland's most famous mountain, achieved 150 years earlier by Edward Whymper. The red lights along the Hörnli ridge retrace the English climber's route. At the time, it was a true defiance of death in the name of exploration.
Philippe J DEWOST's insight:
Which one do you prefer? According to the article, none of them was photoshopped...

One of Earth's driest places is now a pink flower wonderland

Parts of Chile's Atacama Desert, one of the driest places on Earth, look like a psychedelic wonderland as pink mallow flowers bloom in the valley, following a year of unprecedented rain. Massive downpours in March gave parts of the desert their first taste of rain in almost seven years. Some areas got as much as seven years' worth of rain in just 12 hours.
Philippe J DEWOST's insight:
I thought that only ESO giant telescopes were (slowly) blossoming in the Atacama...

Xiaomi launches Android-powered Mi TV 3 with 60-inch 4K display for less than $800

Xiaomi on Monday unveiled the Mi TV 3, a 60-inch TV with a 4K display, at a price point of RMB 4,999 ($780). Much like the Mi TV 2 and previous-generation television sets from the company, the Mi TV 3 has been launched in the company's home market, China.

The Mi TV 3 sports an LG-made 60-inch 4K display with lossless picture quality, MEMC, and a wide color gamut. The display comes with a full aluminum frame running along the sides. At 11.6mm, the Mi TV 3 is pretty sleek too. It is powered by an MStar 6A928 processor with Cortex-A17 CPU cores and a Mali T760 GPU. 8GB of eMMC 5.0 flash is used as internal storage. The company hasn't revealed any information about the RAM.

On the connectivity side, there are three HDMI ports, two USB ports, one VGA port, one Ethernet port, one AV input, a subwoofer output, and a standard RF connector. On the audio front, the Mi TV 3 features Virtual Surround technology, deeper bass, and a dialogue-enhancement tool with auto volume balance. The company says the TV's acoustics have been put together by Grammy award winner Luca Bugnardi and Wang Fuyu, former head of research at Philips acoustics.

The speaker bar, which sells separately at RMB 999 ($160), is made up of four 2.5-inch mid-range speakers and, interestingly, houses the TV's main board. The company says it has separated the processor board from the screen because doing so significantly reduces the replacement cost of internal components. This also extends the TV's life cycle, the company claims.
Philippe J DEWOST's insight:

104 ppc (pixels per $ cent) looks like an interesting metric
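
The quip above can be checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming a standard UHD panel (3840 × 2160) and the roughly $800 price point mentioned in the title:

```python
# Back-of-the-envelope "pixels per cent" (ppc) for the Mi TV 3.
# Assumptions: standard UHD resolution (3840 x 2160) and a ~$800 price.
width, height = 3840, 2160
price_cents = 800 * 100

ppc = (width * height) / price_cents  # pixels bought per US cent
print(round(ppc))  # → 104
```

At the exact $780 figure the metric comes out a little higher (~106), so the 104 figure matches the rounded "$800" headline price.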


How did people in 1900 picture the year 2000? The answer in 11 astonishing drawings

In the early 1900s, the French painter Jean-Marc Côté joined forces with other artists to imagine, in a series of paintings, how people would live in France a century later. Printed for display at the Paris Exposition Universelle, they were later turned into postcards. Wedemain.fr publishes the most astonishing of these visions, at times far-fetched, at times prophetic.
Due to financial difficulties, these cards by Jean-Marc Côté were never sold. Many years later, they were unearthed by science-fiction author Isaac Asimov, who published and commented on some of them in his book "Futuredays: A Nineteenth Century Vision of the Year 2000". Today, they have passed into the public domain.
Philippe J DEWOST's insight:

Beyond the camper van (the last image) and a pronounced appetite for robo-mechanization, I think the most astonishing vision remains the flying postman, especially when he hovers in place...


All Along the Fractures - NASA Unveils Incredible Images of Martian Sand Dunes

The High Resolution Imaging Science Experiment (HiRISE) camera aboard NASA's Mars Reconnaissance Orbiter often takes images of Martian sand dunes to study the mobile soils. These images provide information about erosion and movement of surface material, about wind and weather patterns, even about the soil grains and grain sizes. However, looking past the dunes, these images also reveal the nature of the substrate beneath.

Within the spaces between the dunes, a resistant and highly fractured surface is revealed. The fractured ground resists erosion by the wind, and suggests the material is bedrock that has since been shattered by a history of bending stresses or temperature changes, such as cooling.
Philippe J DEWOST's insight:

This slug-shaped feature is actually a sand dune on Mars, captured by the HiRISE camera.
