COMPUTATIONAL THINKING and CYBERLEARNING
Supercomputing requires math, algebra, computational thinking skills, and an awareness of gateways to computing. New technologies require rethinking how technology is used; cyberlearning does that for transformational learning.
Rescooped by Bonnie Bracey Sutton from Creative teaching and learning

Should every school class be a computer coding class? - The Hechinger Report

This spring, at St. Anne’s-Belfield School in Charlottesville, Virginia, the fifth-grade Spanish class programmed computers to produce bilingual, animated photo albums. The seventh-grade science class rejiggered the code behind climate models.

Via Leona Ungerer
Rescooped by Bonnie Bracey Sutton from Tracking the Future

3rd Annual Seymour Benzer Lecture - Aliens, computers and the bio-economy - An introduction to synthetic biology

Our capacity to partner with biology to make useful things is limited by the tools that we can use to specify, design, prototype, test, and analyze natural or engineered biological systems. However, biology has typically been engaged as a "technology of last resort" in attempts to solve problems that other more mature technologies cannot. This lecture will examine some recent progress on virus genome redesign and hidden DNA messages from outer space, building living data storage, logic, and communication systems, and how simple but old and nearly forgotten engineering ideas are helping make biology easier to engineer.


Via Szabolcs Kósa
Rescooped by Bonnie Bracey Sutton from Tracking the Future

Brainlike Computers, Learning From Experience


Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head.


Via Szabolcs Kósa
VendorFit's curator insight, December 31, 2013 3:27 PM

Artificial intelligence is the holy grail of technological achievement: creating an entity that can learn from its own mistakes and can, independently of programmer intervention, develop new routines and programs. The New York Times reports that the first "learning" computer chip is to be released in 2014, an innovation with profound consequences for the tech market. As these devices become cheaper, they should enable robotics and devices that incorporate richer sensory input and can parse real objects, such as faces, from background noise.
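The "learning from its own mistakes" idea has a classic, simple illustration: error-driven weight updates. Below is a minimal sketch of a perceptron in Python (a toy example chosen for illustration, not the actual mechanism of the chip described in the article): the model adjusts its weights only when it makes a wrong prediction.

```python
# Minimal error-driven learning sketch: a perceptron trained on examples.
# Weights change only on a mistake -- "learning from errors" in its simplest form.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single perceptron on (inputs, label) pairs with labels 0/1."""
    n = len(samples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for inputs, label in samples:
            activation = sum(w * x for w, x in zip(weights, inputs)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction          # nonzero only on a mistake
            if error != 0:                      # update weights from the mistake
                weights = [w + lr * error * x for w, x in zip(weights, inputs)]
                bias += lr * error
    return weights, bias

def predict(weights, bias, inputs):
    """Apply the learned linear threshold rule to a new input."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# Learn the logical AND function purely from labeled examples.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
```

After training, `predict(w, b, [1, 1])` returns 1 and the other inputs return 0: the rule was never programmed in, only inferred from corrections of wrong guesses.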

Laura E. Mirian, PhD's curator insight, January 10, 2014 1:16 PM

The Singularity is not far away.

Rescooped by Bonnie Bracey Sutton from Computational Tinkering

Google Unveils Software That Battles Deforestation


Google seems to have its hand in virtually everything, from ocean mapping to home energy monitors. Now it tackles deforestation.


Via Anne Caspari, Susan Einhorn