The macroeconomic discussions that Apple’s success prompts tend to be very curious things. Here we have a company that’s been phenomenally successful, making products people love and directly creating nearly 50,000 American jobs in doing so, criticised for not locating its manufacturing operations in America, even as Americans complain to Apple about the working conditions of those doing the manufacture abroad: life in dormitories, 12-hour shifts 6 days a week, and low pay. It isn’t enough for Apple to have changed the world with its innovative consumer electronics. It must also rebuild American manufacturing, and not just any manufacturing: the manufacturing of decades ago when reasonable hours and high wages were the norm.
The utility of Apple, however, is that it does provide a framework within which we can discuss the significant changes that have occurred across the global economy in recent decades. Contributing to that effort is a very nice and much-talked-about piece from the New York Times, which asks simply why it is that Apple’s manufacturing is located in Asia.
Lunch with the FT: Zbigniew Brzezinski
“Americans don’t learn about the world, they don’t study world history, other than American history in a very one-sided fashion, and they don’t study geography,” Brzezinski says. “In that context of widespread ignorance, the ongoing and deliberately fanned fear about the outside world, which is connected with this grandiose war on jihadi terrorism, makes the American public extremely susceptible to extremist appeals.” But surely most Americans are tired of overseas adventures, I say. “There is more scepticism,” Brzezinski concedes. “But the susceptibility to demagoguery is still there.”
Information Is Cheap, Meaning Is Expensive
The European: A computer “is a simple mind having a will but capable of only two ideas”, you have said. Does it make sense to think of a technical apparatus in biological terms?
Dyson: The quote comes from an illustration of a circuit diagram that Lewis Fry Richardson produced in 1930. It was a very prophetic idea, like most of the stuff that Richardson did. He had drawn this diagram of an indeterminate circuit, so it was impossible to predict which state the circuit would be in. Maybe those are the origins of mind: a simple and indeterminate circuit. The significance of Richardson’s idea was that he broke with the assumption that computation had to be deterministic, because so few other things in the universe are deterministic. Alan Turing was very explicit that computers will never be intelligent unless they are allowed to make mistakes. The human mind is not deterministic; it is not flawless. So why would we want computers to be flawless?
The European: The ultimate indeterminate process on Earth is evolution. Yet evolution doesn’t really require input and commands, it sustains and develops itself. That seems fundamentally different from the way we think about technological evolution…
Dyson: Biological evolution is a bottom-up process. There are differences between the two realms, but there are also similarities: In both biology and technology, things develop into structures of increasing complexity. That’s what Nils Aall Barricelli saw right away. He tried to understand the origins of the genetic code and apply that to the development of computers. The question was whether you could run computer experiments that allowed increases in systemic complexity to happen. And very quickly that stopped being an experiment and codes began evolving in the wild—not by random mutation, but by crossing and symbiosis, exactly as Barricelli prescribed.
People are biased against creative ideas, studies find
The next time your great idea at work elicits silence or eye rolls, you might just pity those co-workers. Fresh research indicates they don’t even know what a creative idea looks like and that creativity, hailed as a positive change agent, actually makes people squirm.
“How is it that people say they want creativity but in reality often reject it?” said Jack Goncalo, ILR School assistant professor of organizational behavior and co-author of research to be published in an upcoming issue of the journal Psychological Science. The paper reports on two 2010 experiments at the University of Pennsylvania involving more than 200 people.
The Scourge of the Faith-Based Paper Dollar: Jim Grant foresees a new American gold standard despite Wall Street’s stake in monetary chaos
Jim Grant’s father pursued a varied career, including studying the timpani. He even played for a while with the Pittsburgh Symphony. But the day came when he rethought his career choice. “For the Flying Dutchman overture,” says his son, “they had him cranking a wind machine.”
The younger Mr. Grant, who can be sardonic about his own chosen profession, might say he’s spent the past 28 years cranking a wind machine, though it would be a grossly unjust characterization. Mr. Grant is founder and writer of Grant’s Interest Rate Observer, perhaps the most iconic of the Wall Street newsletters. He is also one of Wall Street’s strongest advocates of the gold standard, knowing full well it would take away much of Wall Street’s fun.
You might say that, as a journalist and historian of finance, he has been in training his whole life for times like ours—in which the monetary disorders he has so astutely chronicled are reaching a crescendo. The abiding interest of Grant’s, both man and newsletter, has been the question of value, and how to know it. “Kids today talk about beer goggles—an especially sympathetic state of perception with regard to a member of the opposite sex,” he says of our current market environment. “We collectively wear interest-rate goggles because we see market values through the prism of zero-percent funding costs. Everything is distorted.”
How to survive the age of distraction
Read a book with your laptop thrumming. It can feel like trying to read in the middle of a party where everyone is shouting.
In the 20th century, all the nightmare-novels of the future imagined that books would be burnt. In the 21st century, our dystopias imagine a world where books are forgotten. To pluck just one example, Gary Shteyngart’s novel Super Sad True Love Story describes a world where everybody is obsessed with their electronic Apparat – an even more omnivorous iPhone with a flickering stream of shopping and reality shows and porn – and has somehow come to believe that the few remaining unread paper books let off a rank smell. The book on the book, it suggests, is closing.
I have been thinking about this because I recently moved flat, which for me meant boxing and heaving several Everests of books, accumulated obsessively since I was a kid. Ask me to throw away a book, and I begin shaking like Meryl Streep in Sophie’s Choice and insist that I just couldn’t bear to part company with it, no matter how unlikely it is I will ever read (say) a 1,000-page biography of little-known Portuguese dictator Antonio Salazar. As I stacked my books high, and watched my friends get buried in landslides of novels or avalanches of polemics, it struck me that this scene might be incomprehensible a generation from now. Yes, a few specialists still haul their vinyl collections from house to house, but the rest of us have migrated happily to MP3s, and regard such people as slightly odd. Does it matter? What was really lost?
Is Facebook geared to dullards?
Are you ashamed that you find Facebook boring? Are you angst-ridden by your weak social-networking skills? Do you look with envy on those whose friend-count dwarfs your own? Buck up, my friend. The traits you consider signs of failure may actually be marks of intellectual vigor, according to a new study appearing in the May issue of Computers in Human Behavior.
The study, by Bu Zhong and Marie Hardin at Penn State and Tao Sun at the University of Vermont, is one of the first to examine the personalities of social networkers. The researchers looked in particular at connections between social-network use and the personality trait that psychologists refer to as “need for cognition,” or NFC. NFC, as Professor Zhong explained in an email to me, “is a recognized indicator for deep or shallow thinking.” People who like to challenge their minds have high NFC, while those who avoid deep thinking have low NFC. Whereas, according to the authors, “high NFC individuals possess an intrinsic motivation to think, having a natural motivation to seek knowledge,” those with low NFC don’t like to grapple with complexity and tend to content themselves with superficial assessments, particularly when faced with difficult intellectual challenges.
The researchers surveyed 436 college students during 2010. Each participant completed a standard psychological assessment measuring NFC as well as a questionnaire measuring social network use. (Given what we know about college students’ social networking in 2010, it can be assumed that the bulk of the activity consisted of Facebook use.) The study revealed a significant negative correlation between social network site (SNS) activity and NFC scores. “The key finding,” the authors write, “is that NFC played an important role in SNS use. Specifically, high NFC individuals tended to use SNS less often than low NFC people, suggesting that effortful thinking may be associated with less social networking among young people.” Moreover, “high NFC participants were significantly less likely to add new friends to their SNS accounts than low or medium NFC individuals.”
To put it in layman’s terms, the study suggests that if you want to be a big success on Facebook, it helps to be a dullard.
2011: And Still No Energy Policy
“First generation [corn] ethanol I think was a mistake. The energy conversion ratios are at best very small.”
– Al Gore, speaking at a Green Energy Conference on November 22, 2010
“Ethanol is not an ideal transportation fuel. The future of transportation fuels shouldn’t involve ethanol.”
– Secretary of Energy Steven Chu, November 29, 2010
No one knows what brought on the blast of political honesty in the last eight days of November. Having been a rabid ethanol booster for most of his political career, former Vice President Al Gore reversed course and apologized for supporting ethanol. Of course Gore’s reason for taking that position was perfectly understandable — for a politician. As he told the Athens energy conference attendees, “One of the reasons I made that mistake is that I paid particular attention to the farmers in my home state of Tennessee, and I had a certain fondness for the farmers of Iowa because I was about to run for President.”
Translated from politics-speak into English: pandering to farmers gets votes. But if you claim to be planning energy policy for everyone else, then winning farmers’ votes shouldn’t determine what’s right for the nation’s fuel supplies.
What Did We Do Pre-iPhone? Part II
I talked with an iPhone owner during a recent conference. While tapping away at email and a variety of apps, she remarked, “I don’t know what I did before…”
Changing everything, including education.
Digital Maoism
From “Wikinomics” to “Cognitive Surplus” to “Crowdsourcing”, there is no shortage of books lauding the “Web 2.0” era and celebrating the online collaboration, interaction and sharing that it makes possible. Today anyone can publish a blog or put a video on YouTube, and thousands of online volunteers can collectively produce an operating system like Linux or an encyclopedia like Wikipedia. Isn’t that great?
No, says Jaron Lanier, a technologist, musician and polymath who is best known for his pioneering work in the field of virtual reality. His book, “You Are Not A Gadget: A Manifesto”, published earlier this year, is a provocative attack on many of the internet’s sacred cows. Mr Lanier lays into the Web 2.0 culture, arguing that what passes for creativity today is really just endlessly rehashed content and that the “fake friendship” of social networks “is just bait laid by the lords of the clouds to lure hypothetical advertisers”. For Mr Lanier there is no wisdom of crowds, only a cruel mob. “Anonymous blog comments, vapid video pranks and lightweight mash-ups may seem trivial and harmless,” he writes, “but as a whole, this widespread practice of fragmentary, impersonal communication has demeaned personal interaction.”
If this criticism of Google, Facebook, Twitter and Wikipedia had come from an outsider—a dyed-in-the-wool technophobe—then nobody would have paid much attention. But Mr Lanier’s denunciation of internet groupthink as “digital Maoism” carries more weight because of his career at technology’s cutting edge.