Just when you thought your favourite childhood science hero, Sir David Attenborough, couldn't get any more awesome, some genius over at the Lovin Dublin Facebook page has edited his pithy narration over the top of Pokémon Go game play.
The result is both hilarious and nostalgic, seeing as the game is probably the closest thing today's generation will get to actually sleuthing wild animals National Geographic-style.
The video really has to be seen to be appreciated, but the best part of all is when even Attenborough is sick of goddamn Zubats.
"Bats, with their fluttering zigzag flights, are not easy targets," he explains in the footage above. "That is one bat that will not return to the roost tonight."
For everyone (anyone?) who hasn't tried playing Pokémon Go just yet, it's not too late.
It might just be a game, but it's reportedly helping people to treat their depression and anxiety by getting them out of the house and socialising.
Plus, you get to walk around your neighbourhood, phone in hand, pretending to be a zoologist on the hunt for the next rare species. And, if you're lucky, you might even find it.
Iris recognition, retina scanning, fingerprints, voice recognition—all of these show promise. But there’s one security tool that’s secure, effortless and available now. It’s a special kind of face recognition that’s available on some Windows 10 computers—those equipped with an Intel RealSense camera, such as the Surface Pro 4.
RealSense is actually a sophisticated set of three sensors: one each for infrared, color and 3-D perception. Some laptops come with the RealSense camera built in or you can buy one as an external gadget that plugs into your computer’s USB jack.
The feature is called Windows Hello. Actually, Hello can log you into your PC using fingerprint, iris or facial recognition—but the facial option is by far the most convenient. Once it’s set up, when you sit down in front of your computer, it recognizes your face and logs you in instantly. You can’t fool it with a photograph, a 3-D model of your head or even an identical twin. Thanks to the infrared camera, you can log yourself in even in the dark.
Why do people seek out information about an ex's new relationships, read negative Internet comments and do other things that will obviously be painful? Because humans have an inherent need to resolve uncertainty, according to a recent study in Psychological Science. The new research reveals that the need to know is so strong that people will seek to slake their curiosity even when it is clear the answer will hurt.
In a series of four experiments, behavioral scientists at the University of Chicago Booth School of Business and the Wisconsin School of Business tested students' willingness to expose themselves to aversive stimuli in an effort to satisfy curiosity. For one trial, each participant was shown a pile of pens that the researcher claimed were from a previous experiment. The twist? Half of the pens would deliver an electric shock when clicked.
Twenty-seven students were told which pens were rigged; another 27 were told only that some were electrified. When left alone in the room, the students who did not know which ones would shock them clicked more pens and incurred more jolts than the students who knew what would happen. Subsequent experiments replicated this effect with other stimuli, such as the sound of fingernails on a chalkboard and photographs of repulsive insects.
The drive to discover is deeply ingrained in humans, on par with the basic drives for food or sex, says Christopher Hsee of the University of Chicago, a co-author of the paper. Curiosity is often considered a good instinct—it can lead to new scientific advances, for instance—but sometimes such inquiry can backfire. “The insight that curiosity can drive you to do self-destructive things is a profound one,” says George Loewenstein, a professor of economics and psychology at Carnegie Mellon University who has pioneered the scientific study of curiosity.
Morbid curiosity is possible to resist, however. In a final experiment, participants who were encouraged to predict how they would feel after viewing an unpleasant picture were less likely to choose to see such an image. These results suggest that imagining the outcome of following through on one's curiosity ahead of time can help determine whether it is worth the endeavor. “Thinking about long-term consequences is key to mitigating the possible negative effects of curiosity,” Hsee says. In other words, don't read online comments.
"For a long time, SD card data recovery was a tough problem to deal with, but it is no longer difficult."
SD card is short for Secure Digital card, also known as a Secure Digital memory card. It is a newer generation of memory device based on semiconductor flash memory. SD cards are widely used in a variety of portable devices, including digital cameras, personal digital assistants (PDAs) and multimedia players.
In 1999, Panasonic put forward the concept of the SD card, and Toshiba and SanDisk carried out the substantive development. In 2000, the three companies established the SD Association (Secure Digital Association, or SDA), which attracted a large number of companies to the organization: IBM, Microsoft, Motorola, NEC, Samsung, and so on.
For years, scientists have been experimenting with "biobots." Examples include insects fitted with various electronic systems that can harvest kinetic energy from their wings, those that use the Xbox's Kinect interface to follow a set path, and those put to work mapping building interiors. Now, engineers at Washington University in St. Louis are developing a method to tap into the highly tuned olfactory system of locusts, using them like tiny cyborg sniffer dogs to detect the smell of chemicals used in explosives.
Leading the project is Baranidharan Raman, Associate Professor of Biomedical Engineering at WUSTL, who has previously conducted research into the sensory systems of locusts. Those studies determined how the insects' brains light up in response to olfactory stimuli, and found that even when clouded by other smells, locusts are able to single out odors they've been trained to identify.
Since the locusts' natural chemical-sensing system is far more powerful than any artificial one, Raman plans to harness the power of that nose, attaching miniature electronics to the insects that monitor their brains and determine, through their neural activity, which chemical compounds the locusts are detecting.
Technology can be awkward. Our pockets are weighed down with ever-larger smartphones that are a pain to pull out when we’re in a rush. And attempts to make our devices more easily accessible with smart watches have so far fallen flat. But what if a part of your body could become your computer, with a screen on your arm and maybe even a direct link to your brain?
Artificial electronic skin (e-skin) could one day make this a possibility. Researchers are developing flexible, bendable and even stretchable electronic circuits that can be applied directly to the skin. As well as turning your skin into a touchscreen, this could also help replace feeling if you’ve suffered burns or problems with your nervous system.
The simplest version of this technology is essentially an electronic tattoo. In 2004, researchers in the US and Japan unveiled a pressure sensor circuit made from pre-stretched thinned silicon strips that could be applied to the forearm. But inorganic materials such as silicon are rigid and the skin is flexible and stretchy. So researchers are now looking to electronic circuits made from organic materials (usually special plastics or forms of carbon such as graphene that conduct electricity) as the basis of e-skin.
Typical e-skin consists of a matrix of different electronic components – flexible transistors, organic LEDs, sensors and organic photovoltaic (solar) cells – connected to each other by stretchable or flexible conductive wires. These devices are often built up from very thin layers of material that are sprayed or evaporated onto a flexible base, producing a large (up to tens of cm2) electronic circuit in a skin-like form.
Imagine a far flung land where you can catch a ride from the Jackie Chan bus stop to a restaurant called Translate Server Error, and enjoy a hearty feast of children sandwiches and wife cake all washed down with some evil water.
If such a rich lunch gets stuck in your gnashers, you'll be pleased to know there are plenty of Methodists on hand to remove your teeth.
And if by this point you've had enough of the bus, fly home in style on a wide-boiled aircraft. But whatever you do, please remember that when you land at the airport, eating the carpet is strictly prohibited.
No, I haven't gone mad. These are all real-world examples of howlers by auto-translation software.
Joking aside, poor translations can have big implications for firms who run the risk of offending customers and losing business, or at least looking very amateurish.
We are the distracted generations, wasting hours a day checking irrelevant emails and intrusive social media accounts.
And this "always on" culture - exacerbated by the smartphone - is actually making us more stressed and less productive, according to some reports.
"Something like 40% of people wake up, and the first thing they do is check their email," says Professor Sir Cary Cooper of Manchester Business School, who has studied e-mail and workplace stress.
"For another 40%, it's the last thing they do at night."
The Quality of Working Life 2016 report from the Chartered Management Institute earlier this year found that this obsession with checking emails outside of work hours is making it difficult for many of us to switch off.
And this is increasing our stress levels.
So what can we do about it?

Smarter working
The more enlightened firms have been stepping in to help. In 2012, Volkswagen began shutting off employees' email when they were off shift.
Daimler has allowed its workers to have all the work emails they receive while on holiday automatically erased. And France's new labour law, enacted a few weeks ago, encourages all companies to take similar measures.
Dave Coplin, Microsoft UK's chief envisioning officer, believes artificial intelligence tools will learn when we are busy and block alerts, waiting until we're less busy before bringing us the most relevant or interesting messages.
The Taiwanese electronics manufacturer Asus has unveiled a home robot called Zenbo that can talk, control your home and provide assistance when needed – all for the cost of a top-end smartphone.
The $599 (£410) robot rolls around on two wheels, its body shaped like a vacuum-cleaner ball, with an oblong head protruding from the top that houses cameras and a colour touchscreen displaying a face with emotions. It is capable of independent movement, can respond to voice commands and has both entertainment protocols for keeping kids amused and home care systems to help look after older people.
Jonney Shih, the Asus chairman, said: “For decades, humans have dreamed of owning such a companion: one that is smart, dear to our hearts, and always at our disposal. Our ambition is to enable robotic computing for every household.”
It's probably not something you'd say to a person writhing in agony on the floor, but physical pain can have its benefits. It is after all how kids learn to be wary of hot surfaces and carpenters to hit nails on the head. Researchers are now adapting this exercise in self-learning to an artificial nervous system for robots, a tool they believe will better equip these machines to avoid damage and preserve their – and our – well-being.
We send robots into all kinds of situations we wouldn't dare set foot in ourselves. From Fukushima's melted down nuclear plants to landmine-littered conflict zones, their insensitivity to pain and danger is indeed what can make them so useful. Flipping this on its head and making them feel as we do seems counter-productive, but scientists from Leibniz University of Hannover believe it could make robots more durable and safer for us to be around.
Researchers Johannes Kuehn and Professor Sami Haddadin have developed a pain-reflex controller for a BioTac fingertip sensor fitted to a Kuka robotic arm. They built a nervous robot-tissue model that is based on human skin, which helps the system determine how much pain should be felt by the machine in response to differing levels of force.
Visitors to a Pizza Hut in Asia will soon be able to place an order, ask about nutritional info and pay for their meal without even speaking to a member of staff, or at least a human one. A robot that can interact with customers, like a glorified self-checkout, is to be piloted at the restaurant.
While we've seen promising prototypes of computers that conform to the contours of human wrists and forearms, the technology isn't quite ready for mainstream adoption yet. But this hasn't stopped one forward-thinking team of researchers from coming up with a new way to power these wearable electronics, developing a soft, millimeter-scale battery that can be stretched over the skin like a band-aid.
VR hardware is already capable of tracking your head, your hands, your eyes and in some cases, your feet, but Veeso is claimed to be the first VR headset to capture your face and transmit your expressions – and as a result, your emotions – onto a virtual avatar in real time. With it, the company is emphasizing emotional connections through chat apps and social games like poker.
Like the Samsung Gear VR and Google Cardboard, Veeso is a smartphone-based VR headset, which the company claims is compatible with Android and iOS devices. Unlike those aforementioned headsets, however, Veeso has two infrared cameras mounted on it to capture the wearer's facial expressions.
One of these cameras is located between the eyes to capture pupil movements, eyebrows, and how open or closed the eyelids are, while the second hangs off the bottom of the unit, taking in the jaw, lips and mouth. Together, from what we see in the videos, they seem to do a pretty solid job of covering the whole face and mimicking the facial expressions on a digital avatar in real time.
In his 1963 book God and Golem, the founder of the cybernetics movement Norbert Wiener suggested a compelling thought experiment. Imagine cutting off someone’s hand, he wrote, but leaving intact the key muscles and nerves. Theoretically, a prosthesis could connect directly both to nerves and muscles, giving the subject control of the replacement organ as if it were real.
So far so sensible: this scenario was a reasonable extrapolation at the time, and is close to becoming a reality today. Wiener, however, went further. Having imagined an artificial hand able to replace its original, he wondered why we should not now imagine the addition of an entirely new kind of limb or sensory organ? “There is,” he wrote, “a prosthesis of parts which we do not have and which we never have had.” There was no need to stop at nature. Human-machine integration could in theory blur its boundaries well beyond replacement.
It’s 14 July 2016, and between typing this paragraph and the last I dashed outside with my iPhone to catch a Pokémon lurking next to a tree (a cute orange lizard: Charmander, weight 8.5kg, height 0.6m).
What would Wiener have made of this? I suspect he would have been delighted. While I’m playing Pokémon, my smartphone functions much like a sensory prosthesis. In order to move my avatar around a map, I must move myself. When I get close enough to a target, I hold the device up and through its camera see something superimposed on the world that would otherwise be invisible. It’s like having a sixth sense. My Pokémon-gathering escapades place me somewhere between a cyborg and a stamp collector.
Houseplants have never been known as great conversationalists, but it's possible we just can't hear what they're saying. Swiss company Vivent SARL is hoping to rectify that with its Phytl Signs device, which picks up the tiny electrical signals emitted by plants and broadcasts them through a speaker. The ultimate goal is to translate what the plants are actually "saying."
The system, which is currently the subject of a crowdfunding campaign, features two receptors – a stake that is inserted into the soil next to the plant, and a clip that gently connects to a leaf. These measure the voltage coming from the plant, which feeds into a signal processor. From there the plant-speak is output through a built-in speaker. A smartphone app can also receive raw data from a plant, allowing analysis of the signals using data analysis software.
Unlike current plant monitors on the market that measure environmental metrics like soil moisture and sunlight, the Phytl Signs device is claimed to pick up on whether your plant is thriving or stressed, active or quiet, or besieged by pests. The plant responds immediately to a change in lighting or the cutting of a leaf with a spike in sound: an electronic howl akin to a theremin. But decoding what the audio output means is still being worked out by the company.
To that end, the company encourages device owners to share their data with an online community of fellow users, letting it crowdsource the decoding and translation of the plant signals so they can be understood.
Children with a rare neurological disease were recently given the chance to walk for the first time thanks to a new robotic exoskeleton. These devices – which are essentially robotic suits that give artificial movement to a user’s limbs – are set to become an increasingly common way of helping people who’ve lost the use of their legs to walk. But while today’s exoskeletons are mostly clumsy, heavy devices, new technology could make them much easier and more natural to use by creating a robotic skin.
Exoskeletons have been in development since the 1960s. The first was a bulky set of legs and claw-like gloves reminiscent of the superhero Iron Man, designed to use hydraulic power to help industrial workers lift hundreds of kilogrammes of weight. It didn’t work, but since then other designs for both the upper and lower body have successfully been used to increase people’s strength, help teach them to use their limbs again, or even as a way to interact with computers using touch or “haptic” feedback.
These devices usually consist of a chain of links and powered joints that align with the user’s own bones and joints. The links are strapped securely to the user’s limbs and when the powered joints are activated they cause their joints to flex. Control of the exoskeleton can be performed by a computer – for example if it is performing a physiotherapy routine – or by monitoring the electrical activity in the user’s muscles and then amplifying the force they are creating.
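The muscle-monitoring mode described above is commonly known as proportional myoelectric control: measure how hard the user's muscle is working and add assistive force in proportion. Here is a toy sketch of the idea; the gain and torque limit are made-up illustrative values, not figures from any particular exoskeleton:

```python
def assistive_torque(emg_activation, gain=25.0, max_torque=40.0):
    """Proportional myoelectric control (illustrative sketch).

    emg_activation: normalized muscle activation in [0, 1], e.g. a
    rectified, low-pass-filtered EMG signal divided by the user's
    maximum voluntary contraction.
    Returns the assistive joint torque in newton-metres: the harder
    the user's muscle works, the more the powered joint helps.
    """
    a = min(max(emg_activation, 0.0), 1.0)  # clamp sensor noise and outliers
    return min(gain * a, max_torque)        # amplify the effort, safety-capped
```

Under these assumed values, half-effort activation (0.5) would command 12.5 Nm of assistance, and the cap prevents a noisy sensor spike from ever commanding dangerous torque.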
While working as a professor in the sensory-motor systems lab at the Swiss Federal Institute of Technology in Zurich (ETH), Robert Riener noticed a need for assistive devices that would better meet the challenge of helping people with daily life. He knew there were solutions, but that it would require motivating developers to rise to the challenge.
So, Riener created Cybathlon, the first cyborg Olympics, in which teams from all over the world will race in Zurich on Oct. 8 to test how well their devices perform routine tasks. Teams will compete in six different categories that push their assistive devices to the limit, on courses carefully developed over three years by physicians, developers and the people who use the technology. Eighty teams have signed up so far.
Riener wants the event to emphasize how important it is for man and machine to work together—so participants will be called pilots rather than athletes, reflecting the role of the assistive technology.
“The goal is to push the development in the direction of technology that is capable of performing day-to-day tasks. And that way, there will be an improvement in the future life of the person using the device,” says Riener.
Here’s a look at events that will be featured in the first cyborg Olympics.
There are three things you can be sure of in life: death, taxes – and lying. The latter certainly appears to have been borne out by the UK’s recent Brexit referendum, with a number of the Leave campaign’s pledges looking more like porkie pies than solid truths.
But from internet advertising, visa applications and academic articles to political blogs, insurance claims and dating profiles, there are countless places we can tell digital lies. So how can one go about spotting these online fibs? Well, Stephan Ludwig from the University of Westminster, Ko de Ruyter from City University London’s Cass Business School, Mike Friedman of the Catholic University of Louvain, and yours truly have developed a digital lie detector – and it can uncover a whole host of internet untruths.
In our new research, we used linguistic cues to compare tens of thousands of emails pre-identified as lies with those known to be truthful. And from this comparison, we developed a text analytic algorithm that can detect deception. It works on three levels.

1. Word use
Keyword searches can be a reasonable approach when dealing with large amounts of digital data. So, we first uncovered differences in word usage between the two document sets. These differences identify text that is likely to contain a lie. We found that individuals who lie generally use fewer personal pronouns, such as I, you, and he/she, and more adjectives, such as brilliant, fearless, and sublime. They also use fewer first-person singular pronouns, such as I, me, mine, with discrepancy words, such as could, should, would, as well as more second-person pronouns (you, your) with achievement words (earn, hero, win).
Fewer personal pronouns indicate an author’s attempt to dissociate themselves from their words, while using more adjectives is an attempt to distract from the lie through a flurry of superfluous descriptions. Fewer first-person singular pronouns combined with discrepancy words indicate a lack of subtlety and a positive self-image, while more second-person pronouns combined with achievement words indicate an attempt to flatter recipients. We therefore included these combinations of search terms in our algorithm.
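The word-use level of such an algorithm can be sketched in a few lines of code. The cue lexicons below are illustrative stand-ins built from the examples in the text; the study's actual word lists and weights are not reproduced here, so treat this as a toy heuristic rather than the researchers' algorithm:

```python
import re

# Illustrative cue lexicons (hypothetical, seeded from the examples above)
FIRST_PERSON = {"i", "me", "mine", "my"}
SECOND_PERSON = {"you", "your", "yours"}
ADJECTIVES = {"brilliant", "fearless", "sublime", "amazing", "perfect"}
ACHIEVEMENT = {"earn", "hero", "win", "success"}

def word_use_cues(text):
    """Return per-word frequencies of each deception-related cue class."""
    words = re.findall(r"[a-z']+", text.lower())
    total = max(len(words), 1)  # avoid division by zero on empty input
    return {
        "first_person": sum(w in FIRST_PERSON for w in words) / total,
        "second_person": sum(w in SECOND_PERSON for w in words) / total,
        "adjectives": sum(w in ADJECTIVES for w in words) / total,
        "achievement": sum(w in ACHIEVEMENT for w in words) / total,
    }

def deception_score(text):
    """Toy score: higher means more deception-like word use
    (fewer first-person pronouns; more adjectives, second-person
    pronouns and achievement words)."""
    c = word_use_cues(text)
    return (c["adjectives"] + c["second_person"] + c["achievement"]
            - c["first_person"])
```

A flattering, achievement-heavy message with no first-person pronouns scores higher on this heuristic than a plain first-person account, mirroring the patterns described above.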
Registration has just opened up for an all new US$5 million XPrize, this time focusing on getting humans collaborating better with artificial intelligence to solve major global issues. Unlike previous competitions, this XPrize, sponsored by IBM's Watson division, doesn't feature a set of pre-determined goals, but instead challenges teams to come up with their own.
You might be familiar with XPrize from its ongoing Google Lunar effort, which is seeing small teams from around the world compete to successfully land a robot on the Moon. It's a seriously ambitious project, and one that has seen rivals team up in the hope of winning out against the competition.
This new project is totally different to the Lunar XPrize, but it's no less ambitious. Aside from the different focus, the big difference here is that the competition is "open," with teams being given the opportunity to pick their own direction. Each participant will be deciding which issues to tackle, and working out how to reach their own goals.
Participants could focus on anything from fixing healthcare or improving education, to taking on major green energy and environmental issues, to any other issue they choose to tackle. XPrize thinks the competition could have a big impact, harnessing the potential of artificial intelligence to solve some of humanity's greatest challenges.
"The IBM Watson AI XPrize will stir innovation and empower a global group of developers, entrepreneurs, and organizations to push the boundaries of human-machine collaboration, forever changing for the better the way in which we live and work," said IBM Watson vice president Stephen Gold.
Apple has quietly announced it will soon start selling solar energy alongside its iPhones and Macbooks.
The company has created a subsidiary called 'Apple Energy' LLC, registered in Delaware but run from its Cupertino headquarters.
While it's not clear exactly what Apple is planning to do with this subsidiary, the firm's latest Federal Energy Regulatory Commission filing, spotted by 9to5Mac, suggests it is thinking of selling surplus solar electricity generated by hundreds of solar projects in its farms in Cupertino and Nevada. However, this should all become clear soon, as the firm requested permission from FERC to begin operations just 60 days after it filed its application on June 6.
Apple showing a green side is by no means anything new. The company announced in 2013 that its data centres had moved to 100 per cent renewable energy rather than coal. The firm's data centre in Maiden, North Carolina, which hosts Apple's iCloud service, now gets its energy from a 100-acre solar farm and fuel cell installations.
Neatly folding clothes into organised piles is a time-consuming and often frustrating task. This machine claims to be able to do it for you.
Produced by a San Francisco startup, the FoldiMate is said to be able to fold clothes as soon as they come out of a washing machine.
The machine, which is currently in the prototype stage, will be released in 2017 and is expected to cost around $850 (£590).
It is claimed the machine can complete three different processes - folding, steaming, and de-wrinkling. This is possible for shirts, trousers and towels.
The firm said it takes up to 10 seconds to fold each item, while de-wrinkling will take 20 to 30 seconds per time. Depending on fabric thickness, it is possible for the machine to take 10 to 30 items per load.
"FoldiMate is like having a friend that folds the laundry for you," the company explains on its website.
"Even if you fold faster than clipping, folding a full load of laundry is akin to using a dishwasher over doing the dishes by hand."
Naturally, as with all new products, the machine will come Wi-Fi enabled.
However, it won't be able to solve all clothes-folding woes. The specifications on the company's website say that it will not be able to automatically fold smaller or larger items. It also won't be able to replace ironing of dress shirts.
The website adds: "FoldiMate will fold and treat most of your laundry (e.g. shirts, pants, towels). Except for large items like linen or small items like underwear or socks."
As the computation and communication circuits we build radically miniaturize (i.e. become so low power that 1 picojoule is sufficient to bang out a bit of information over a wireless transceiver, and so small that 500 square microns of thinned CMOS can hold a reasonable sensor front-end and digital engine), the barrier to introducing these types of interfaces into organisms will get pretty low. Put another way, the rapid pace of computation and communication miniaturization is swiftly blurring the line between the technological base that created us and the technological base we’ve created. Michel Maharbiz of the University of California, Berkeley, is giving an overview (June 16, 2016) of recent work in his lab that touches on this concern. Most of the talk will cover his group's ongoing exploration of the remote control of insects in free flight via implantable radio-equipped miniature neural stimulating systems; recent results with neural interfaces and extreme miniaturization directions will also be discussed. If time permits, he will show recent results on building extremely small neural interfaces the group calls “neural dust,” work done in collaboration with the Carmena, Alon and Rabaey labs.
Radical miniaturization has created the ability to introduce a synthetic neural interface into a complex, multicellular organism, as exemplified by the creation of a “cyborg insect.”
“The rapid pace of computation and communication miniaturization is swiftly blurring the line between technological base we’ve created and the technological base that created us,” explained Dr. Maharbiz. “These combined trends of extreme miniaturization and advanced neural interfaces have enabled us to explore the remote control of insects in free flight via implantable radio-equipped miniature neural stimulating systems.”
Lots of prosthetic feet are available, but most are built to fit men’s shoes, and none can adjust to a heel more than 2 inches high. That’s less than the average women’s heel height in the US, according to the creators of a new, taller option.
Five mechanical engineering students from Johns Hopkins University and their advisors have developed what would be the first non-custom-made prosthetic foot on the market that can adapt to heels 4 inches or higher.
Some 2,100 American women have lost a leg or foot in military service, and more are entering combat assignments, so the demand for a prosthesis that accommodates a range of shoes is expected to grow. The team—who created what they call the Prominence as their senior project—hope their work can help.
Facebook has apologized for banning a photo of a plus-sized model and telling the feminist group that posted the image that it depicts “body parts in an undesirable manner”.
Cherchez la Femme, an Australian group that hosts popular culture talkshows with “an unapologetically feminist angle”, said Facebook rejected an advert featuring Tess Holliday, a plus-sized model wearing a bikini, telling the group it violated the company’s “ad guidelines”.
After the group appealed against the rejection, Facebook’s ad team initially defended the decision, writing that the photo failed to comply with the social networking site’s “health and fitness policy”.
“Ads may not depict a state of health or body weight as being perfect or extremely undesirable,” Facebook wrote. “Ads like these are not allowed since they make viewers feel bad about themselves. Instead, we recommend using an image of a relevant activity, such as running or riding a bike.”
In a statement on Monday, Facebook apologized for its original stance and said it had determined that the photo does comply with its guidelines.
“Our team processes millions of advertising images each week, and in some instances we incorrectly prohibit ads,” the statement said. “This image does not violate our ad policies. We apologize for the error and have let the advertiser know we are approving their ad.”