There are many obstacles to students learning computer science. Whether or not one believes that everyone can be a programmer, the fact remains that 90 percent of high school students never even have the opportunity to learn.
Stephen Wolfram ensmartens all the things (ZDNet). Wolfram is the chief designer of Mathematica, a comprehensive computation platform for science, engineering, advanced mathematics and other grunty stuff, and of Wolfram Alpha, the computational...
Teachers lacking the confidence to teach pupils computer programming skills need to change the way they approach their subject, a pioneering educator has said.
Rather than formally teaching programming, he says he provides the right environment for pupils to explore the subject themselves. He admits he could not possibly teach children everything they need to know about computing.
Computational knowledge. Symbolic programming. Algorithm automation. Dynamic interactivity. Natural language. Computable documents. The cloud. Connected devices. Symbolic ontology. Algorithm discovery. These are all things we’ve been energetically working on—mostly for years—in the context of Wolfram|Alpha, Mathematica, CDF and so on.
But recently something amazing has happened. We’ve figured out how to take all these threads, and all the technology we’ve built, to create something at a whole different level.
With all the discussion of the skill set and mindset necessary for journalists today — both of which are important — we need to also consider a deeper question about mindset: how to go about rethinking our thinking. One key area of exploration is computational thinking, through which we can tie the practice of journalism to the digital technology at the heart of new publication and distribution systems.
Learning to program computers can bring unique insights to other fields for both pupils and teachers. Miles Berry on how computational thinking can revolutionise the way we teach and learn ("The case for agile pedagogy").
A growing movement of nonprofit organizations is using computers, mobile devices, the internet, and even digital currencies to improve the lot of the country's poor and homeless, nudging them toward new jobs, new homes, and regular healthcare.
In announcing his Internet.org initiative, Facebook CEO Mark Zuckerberg painted internet access as a basic human right. Many people have laughed off this suggestion, but if you spend time with Darrell Pugh and others like him, you realize it isn’t that far from the truth.
For Pugh, tech education works on another level: It helps people feel good about themselves. “Sometimes, the people that are poor or homeless, they don’t get the technology,” Pugh says. “Once you get an eyes-on or a hands-on, it changes how you look at things. You may not understand the physics of it, the engineering of it, but it’s not magic anymore. That helps people step forward.”
Susan Einhorn's insight:
Eyes-on, hands-on, it changes how you look at things. And how you look at yourself.
Last week, the man himself, Fred Wilson, published a blog post about the importance of learning to code.
His decree was simple and straightforward: If You Aren't Technical, Get Technical. In particular, if you are a non-technical co-founder at a tech company...you must get technical.
But I think that Mr. Wilson is making the process of learning to code seem easier than it is, and I think it's worth addressing the very real, but totally overcome-able, challenges of transitioning from non-techie to techie.
Coding isn't impossibly hard. It's just hard in the way that anything worth having is hard.
And if you break it down into smaller steps, it's downright... manageable :)
For everyone who has just needed someone to help them...
It is SO MUCH easier to learn to code when you have a community of people, whether instructors, fellow students, or friends, to help support and guide you.
Susan Einhorn's insight:
As Seymour Papert would say, learning to code is "hard fun".
In 01970 John Conway developed a computer program called The Game of Life. The idea behind it was that the process of biological life is, despite its apparent complexity, reducible to a finite set of rules. The game is made up of a grid of squares, or “cells,” in one of two states: “alive” or “dead.” A player sets the initial conditions, choosing which squares should be alive. Each turn of the game is like a generation – some squares live, some are born, and some die. Just a few simple rules determine cells’ behavior: Cells with too few or too many neighbors die. Empty squares with the right number of neighbors come to life.
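The rules above can be sketched in just a few lines. This is an illustrative minimal version (my own sketch, not Conway's original program), using the standard counts: a live cell survives with two or three live neighbors, and an empty cell with exactly three live neighbors comes to life.

```python
def neighbors(cell, alive):
    """Count the live neighbors of a cell on an unbounded grid."""
    x, y = cell
    return sum((x + dx, y + dy) in alive
               for dx in (-1, 0, 1) for dy in (-1, 0, 1)
               if (dx, dy) != (0, 0))

def step(alive):
    """One generation: apply the birth/survival rules to every cell
    that is alive or adjacent to a live cell."""
    candidates = {(x + dx, y + dy) for x, y in alive
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)}
    return {c for c in candidates
            if neighbors(c, alive) == 3
            or (c in alive and neighbors(c, alive) == 2)}

# A "blinker": three cells in a row oscillate between a horizontal
# and a vertical bar, generation after generation.
blinker = {(0, 1), (1, 1), (2, 1)}
```

Running `step` repeatedly on an initial set of live cells plays out the generations the excerpt describes; even this tiny rule set produces gliders, oscillators, and the self-reproducing structures mentioned below.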
This simple set-up, played out by the computer over many generations, creates vast complexity that is hard to watch without thinking of life. And, in fact, in 02010 a structure was created within the game by Andrew Wade capable of reproducing itself, much like the real-life molecules that eventually led to all the living creatures on Earth.
By elaborating the rules governing the simulation, could other life-like processes also be modelled? Ecologist Peter Turchin has done just that. He wanted to understand how human societies, not just single organisms, grow and disperse. So, he created a computer simulation not unlike The Game of Life. It’s got a lot more rules, but the basics are similar:
Here we go again. Time after time, school districts have placed great hope on technology to be the silver bullet and great equalizer. From Apple II computers in the 80s and 90s to the "tabletification" of today, education leaders have invested precious resources (estimated to be over $50 billion in 2012) in technology so that students will be "prepared for the future," equipped as "digital citizens" and "21st century learners" who are empowered through technology. But, time and time again, the same reality hits: It is not about just having technology in the classrooms, it is what students do with it and learn from it that matters.
In 2000 we conducted a research study focusing on computer science in schools. A more detailed account can be read in Stuck in the Shallow End: Education, Race, and Computing (MIT Press, 2008). We found that once again the wrong indicators for success were being watched. For instance, as Digital High Schools were granted huge investments for computers and the supporting infrastructure, what was called "computer science" in those schools was often just glorified typing and low-level activities. Most distressing, it was rare, if ever, for college-preparatory computing courses to exist in schools with high numbers of African American and Latino students.
Schools must stop being "technology rich, but curriculum poor."
Literacy in coding is an advantage in this technology-driven economy.
Coding is the new buzz language of today’s tech-savvy world. No matter the occupation, it will surely involve using technology, and those who know how to code, the foundation of computer science, will surely be at an advantage.
Our schools now teach students to use ICT as consumers rather than as programmers, while the rest of the developed world is abuzz with technological innovation, encouraging and building the literacy befitting 21st-century living.
Susan Einhorn's insight:
From Malaysia, but very fitting for all countries.
According to Dr. Thom Dunning from the University of Illinois at Urbana-Champaign, science and engineering research has been revolutionized by computation but, to date, computing has largely been used to organize, prepare for, and disseminate courses. Dunning says that “The potential of using these technologies to teach students the fundamental principles of a subject through authentic computational simulation is largely unexplored.”
According to Dunning, "
The overarching thesis I have is, as computational simulation has revolutionized scientific and engineering research, it can also revolutionize scientific and engineering education.
It’s very clear that computing, and in particular simulation, has had a major impact on most fields of science. It adds a new way for us to understand how the world operates, whether it’s the chemical world, the physical world, the biological world or some combination of all of them, which is of course what happens with global climate simulation. And we really have to think about how we effectively use those kinds of simulations in education."
IBM has built a computational creativity machine that creates entirely new and useful stuff from its knowledge of existing stuff. And the secret sauce in all this? Big data, say the computer scientists behind it.
Lav Varshney and pals at IBM’s T J Watson Research Center in Yorktown Heights are using Watson to tackle the problem of computational creativity. They revealed some aspects of the work to the press last month and have now published more on the arXiv.
Their first problem of course is to define creativity. “Creativity is the generation of a product that is judged to be novel and also to be appropriate, useful, or valuable by a suitably knowledgeable social group,” say Varshney and pals.
So a key factor in their work is that creativity is entirely subjective and so requires detailed feedback from human experts. “A computational creativity system has no meaning in a closed universe devoid of people,” they say.
What’s more, this definition implies that creativity is a process that in principle can be automated.
Over the decades, students have been required to take a foreign language in high school for reasons that relate to expanding communication abilities, furthering global awareness, and enhancing perspective-taking. Recently, our home state of Texas passed legislation that enables computer science to fulfill the high school foreign language requirement. Coding (defined by BusinessDictionary.com as "the process of developing and implementing various sets of instructions to enable a computer to do a certain task") is, after all, both a language and a foreign subject to many students -- and much more.
Susan Einhorn's insight:
Computer languages not only have interesting vocabulary patterns, they have a syntax like any spoken/written language. The more you use any language, including a computer language, the more fluent you become and the better you can express new ideas in that language. So it takes not only having 'language' instruction, but enough time to use the language and immerse learners in it to gain the fluency that's so valuable.
Computer science education is growing increasingly important--not just in schools, but in the nation's economy.
Forty of 50 states do not count computer science toward math or science requirements for high school graduation, and only 1 in 10 schools offer computer programming classes. That could change, though, if more states make efforts similar to those in Washington.
Recent data indicate that only 35 of the state’s 622 high schools offer AP computer science.
Computer science is an incredibly promising major, especially for a young woman. That and engineering are among the college degrees that can offer the highest incomes and the most flexibility — attributes widely cited for drawing many women into formerly male-dominated fields like medicine. Writing code and designing networks are also a lot more portable than nursing, teaching and other traditional pink-collar occupations. Yet just 0.4 percent of all female college freshmen say they intend to major in computer science. In fact, the share of women in computer science has actually fallen over the years. In 1990-91, about 29 percent of bachelor’s degrees awarded in computer and information sciences went to women; 20 years later, that share had plunged to 18 percent. Today, just a quarter of all Americans in computer-related occupations are women.
Ushering in a new era of computing in support of medical research and patient treatment, The University of Texas MD Anderson Cancer Center and IBM jointly announced on Friday that MD Anderson is using the IBM Watson cognitive computing system as a key tool in its mission to eradicate cancer.
Computer systems currently in use have delivered tremendous business and societal benefits by automating tabulation and harnessing computational processing and programming to enhance and amplify enterprise and personal productivity. However, IBM maintains that the computer systems of tomorrow – cognitive systems like Watson — will forever change the way people interact with computing systems to help users extend their expertise across any domain of knowledge and make complex decisions involving extraordinary volumes of fast moving Big Data.
Susan Einhorn's insight:
Computational thinking: visions of the (near) future.
Learners today have the amazing potential to produce their own digital content. But one challenge ahead is to counter a culture where users simply consume digital content rather than engage in the creative process to produce their own. We need to be firmly focused on enabling the next generation of digital makers and equipping them with the key skill to support that creativity: Computational Thinking.
A University of Southern California professor and colleagues from Stanford and Harvard universities were awarded the Nobel Prize in Chemistry on Wednesday for their pioneering use of computer modeling programs to help predict complex chemical reactions.
Their work, which began in the 1970s, has revolutionized chemistry research, where scientists now work with computers as much as they do with test tubes.
“Chemical reactions occur at lightning speed,” read an announcement from the Royal Swedish Academy of Sciences in Stockholm. “In a fraction of a millisecond, electrons jump from one atomic nucleus to the other. Classical chemistry has a hard time keeping up. … Aided by the methods now awarded with the Nobel Prize in Chemistry, scientists let computers unveil chemical processes.”
Walking through the gallery with Mathews, CTO and vice-president of the reality capture group at Autodesk, it's abundantly clear that this future is just, per William Gibson, awaiting its even distribution. "If complexity is free," Mathews asks, "where does complexity come from? That's what design is all about." In other words, if everything that can be rendered in bits can be generated, a world awash in "physibles" -- as these voxels (the volumetric pixels that make up the smallest visible elements in a 3D design) of material are known -- what is the role of design, the process of deciding how those bits should be arranged?
"This is a new design paradigm," Tibbits says. "It's not just designing and making things. It's designing things that change over time, and so how we incorporate that programmability and changeability into design tools is a really big question."
"More and more design is turning away from what you've already designed up here," Kowalski says, gesturing to his head, "and changing into a conversation with the computer. Nobody drafts any more -- everyone creates a higher-level model that expresses design intent. All that stuff we thought of as blueprints are inconsequential outputs of the overall process."
Susan Einhorn's insight:
This is really what we should mean when we use the term 21st century thinking.
Handwork and technology might seem at first glance to be at odds. But there's a case to be made that handwork and computing -- and the kind of process that links the two -- are more closely related than one might think.
When electrical engineering professor Dr. Karen Shoop of Queen Mary University of London took her first knitting workshop, she noticed immediately that knitting is very similar to writing computer code. “I noticed that knitting instructions are largely binary (like computers) – in other words, knit or purl,” she said. “More interesting were the knitting instructions, which read just like regular expressions [of code], used for string matching and manipulation when coding.” Shoop also recognizes that the earliest stages of computing were inspired by handwork: “Of course, computers ultimately started off partially inspired by weaving and the Jacquard loom, or earlier Bouchon’s loom. Arguably some of the earliest programmers were the people making the card/paper punch hole patterns for weaving patterns.”
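Shoop's analogy is easy to demonstrate. The sketch below is my own illustration (not code from her workshop): a regular expression parses a compact knitting instruction such as "k2p2" -- knit two, purl two -- and expands it into its stitch-by-stitch sequence, exactly the kind of string matching and manipulation she describes.

```python
import re

# Each token is a stitch letter (k = knit, p = purl) followed by an
# optional repeat count, so "k2p2" means knit 2 stitches, purl 2.
TOKEN = re.compile(r"([kp])(\d*)")

def expand(instruction):
    """Expand a compact instruction into one letter per stitch."""
    stitches = []
    for stitch, count in TOKEN.findall(instruction):
        # A missing count means the stitch appears once.
        stitches.append(stitch * (int(count) if count else 1))
    return "".join(stitches)
```

For example, `expand("k2p2")` yields the four-stitch row "kkpp", and `expand("kp2")` yields "kpp": the instruction string is a tiny program, and the row of stitches is its output.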