An open letter to software engineers criticizing Neil Ferguson’s epidemics simulation code

Konrad Hinsen:

But the main message of this letter is something different: it’s about your role in this story. That’s of course a collective you, not you the individual reading this letter. It’s you, the software engineering community, that is responsible for tools like C++ that look as if they were designed for shooting yourself in the foot. It’s also you, the software engineering community, that has made no effort to warn the non-expert public of the dangers of these tools. Sure, you have been discussing these dangers internally, even a lot. But to outsiders, such as computational scientists looking for implementation tools for their models, these discussions are hard to find and hard to understand. There are lots of tutorials teaching C++ to novices, but I have yet to see a single one that starts with a clear warning about the dangers. You know, the kind of warning that every instruction manual for a microwave oven starts with: don’t use this to dry your dog after a bath. A clear message saying “Unless you are willing to train for many years to become a software engineer yourself, this tool is not for you.”

As a famous member of your community famously said, software is eating the world. That gives you, dear software engineers, a lot of power in modern society. But power comes with responsibility. If you want scientists to construct reliable implementations of models that matter for public health decisions, the best you can do is make good tools for that task, but the very least you must do is put clear warning signs on tools that you do not want scientists to use – always keeping in mind that scientists are not software engineers, and have neither the time nor the motivation to become software engineers.

Do You Have a Theory?

Ben Evans:

The thread through all of this is that we don’t know what will happen, but we do know what could happen – we don’t know the answer, but we can at least ask useful questions. The key challenge to any assertion about what will happen, I think, is to ask ‘well, what would have to change?’ Could this happen, and if it did, would it work? We’re always going to be wrong sometimes, but we can try to be wrong for the right reasons. The point that Pauli was making in the quote I gave at the beginning is that a theory might be right or wrong, but first it has to rise to the level of being a theory at all. So, do you have a theory?

Sunday Services

“If you love me, you will keep my commandments” – John 14

“Love One Another”

“Because I live, you also will live. On that day you will realize that I am in my Father, and you are in me, and I am in you.”


How to think about uni-disciplinary advice

Tyler Cowen:

Let’s say it’s 1990, and you are proposing an ambitious privatization plan to an Eastern bloc country, and your plan assumes that the enacting government is able to stay on a non-corrupt path the entire time.

While your plan probably is better than communism, it probably is not a very good plan.  A better plan would take sustainability and political realities into account, and indeed many societies did come up with better plans, for instance the Poland plan was better than the Russia plan.

It would not do to announce “I am just an economist, I do not do politics.”  In fact that attitude is fine, but if you hold it you should not be presenting plans to the central government or discussing your plan on TV.  There are plenty of other useful things for you to do.  Or the uni-disciplinary approach still might be a useful academic contribution, but still displaced and to be kept away from the hands of decision-makers.

Nor would it do to claim “I am just an economist.  The politicians have to figure the rest out.”  They cannot figure the rest out in most cases.  Either stand by your proposed plan or don’t do it.  It is indeed a proposal of some sort, even if you package it with some phony distancing language.

Instead, you should try to blend together the needed disciplines as best you can, consulting others when necessary, and offer the best plan you can, namely the best plan all things considered.

Why have consumer drones vanished from DJI’s Hong Kong stores?

Kate Chiu:

The latest Mavic Air 2 and other consumer drones have been missing from DJI’s official stores in Hong Kong, the city where the company began

Before the rest of the world got its first taste of the Phantom or the Mavic Pro, Hong Kong was home to the team that went on to create some of DJI’s biggest hits. Founded in 2006 by a model plane enthusiast studying at a university in the city, DJI spent years developing the components behind its popular drones with the help of a local professor.

But now, as DJI prepares to ship its latest drone, the Mavic Air 2 is notably missing from Hong Kong’s official DJI stores. While it’s available for purchase elsewhere, such as Taiwan and Japan, it can’t be found in the company’s glimmering three-story flagship store in downtown Hong Kong or in the official online shop, which currently shows all consumer drones listed as out of stock. A link posted on the DJI Facebook page along with a Mavic Air 2 promo video leads to a 404 error page.

“You sure it’s purchasable in Hong Kong? You don’t even have [last year’s] Mavic Mini!” one person commented on the post. “Have you considered how long the Hong Kong store has been running out of drones?”

Former Apple Engineer: Here’s Why I Trust Apple’s COVID-19 Notification Proposal

David Shayer:

I also wrote iPhone apps for a mid-size technology company that shall remain nameless. You’ve likely heard of it, though, and it has several thousand employees and several billion dollars in revenue. Call it TechCo, in part because its approach to user privacy is unfortunately all too common in the industry. It cared much less about user privacy than Apple.

The app I worked on recorded every user interaction and reported that data back to a central server. Every time you performed some action, the app captured what screen you were on and what button you tapped. There was no attempt to minimize the data being captured, nor to anonymize it. Every record sent back included the user’s IP address, username, real name, language and region, timestamp, iPhone model, and lots more.

Keep in mind that this behavior was in no way malicious. The company’s goal wasn’t to surveil their users. Instead, the marketing department just wanted to know what features were most popular and how they were used. Most important, the marketers wanted to know where people fell out of the “funnel.”

When you buy something online, the purchase process is called a funnel. First, you look at a product, say a pair of sneakers. You add the sneakers to your shopping cart and click the buy button. Then you enter your name, address, and credit card, and finally, you click Purchase.

At every stage of the process, people fall out. They decide they don’t really want to spend $100 on new sneakers, or their kids run in to show them something, or their spouse tells them that dinner is ready. Whatever the reason, they forget about the sneakers and never complete the purchase. It’s called a funnel because it narrows like a funnel, with fewer people successfully progressing through each stage to the end.

Companies spend a lot of time figuring out why people fall out at each stage in the funnel. Reducing the number of stages reduces how many opportunities there are to fall out. For instance, remembering your name and address from a previous order and auto-filling it means you don’t have to re-enter that information, which reduces the chance that you’ll fall out of the process at that point. The ultimate reduction is Amazon’s patented 1-Click ordering. Click a single button, and those sneakers are on their way to you.

TechCo’s marketing department wanted more data on why people fell out of the funnel, which they would then use to tune the funnel and sell more product. Unfortunately, they never thought about user privacy as they collected this data.

Most of the data wasn’t collected by code that we wrote ourselves, but by third-party libraries we added to our app. Google Firebase is the most popular library for collecting user data, but there are dozens of others. We had a half-dozen of these libraries in our app. Even though they provided roughly similar features, each collected some unique piece of data that marketing wanted, so we had to add it.
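
To make that concrete, here is a minimal sketch of the kind of per-tap record Shayer describes. It is an illustration only: the TapEvent type, field names, and values are hypothetical, not TechCo’s actual schema or the API of any particular analytics library.

```swift
import Foundation

// Hypothetical shape of the per-interaction record Shayer describes:
// the screen and button, plus a full set of directly identifying fields,
// with no minimization or anonymization.
struct TapEvent: Codable {
    let screen: String       // e.g. "ProductDetail", "Checkout"
    let button: String       // e.g. "AddToCart", "Purchase"
    let ipAddress: String
    let username: String
    let realName: String
    let locale: String       // language and region
    let timestamp: Date
    let deviceModel: String
}

// One record per interaction, serialized and sent to a central server.
let event = TapEvent(
    screen: "ProductDetail",
    button: "AddToCart",
    ipAddress: "203.0.113.7",
    username: "jane_doe",
    realName: "Jane Doe",
    locale: "en_US",
    timestamp: Date(),
    deviceModel: "iPhone 11"
)

let encoder = JSONEncoder()
encoder.dateEncodingStrategy = .iso8601
if let body = try? encoder.encode(event) {
    print(String(data: body, encoding: .utf8)!)
}
```

Note how little of this is actually needed to answer the funnel question: counting how many anonymous sessions reach each screen-and-button pair would show where people drop off, without the IP address, username, or real name ever leaving the phone.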

Yes, China’s internet is strictly policed, but it’s also a place for weirdness, subversion, and the occasional glimpse of freedom.

Mara Hvistendahl:

The story is a familiar one by now: When a mysterious virus cropped up in the Chinese city of Wuhan in December 2019, a 33-year-old ophthalmologist named Li Wenliang took to WeChat to sound the alarm. “7 cases of SARS have been confirmed in the Huanan fruit and seafood market,” he wrote in a private message to a group of his medical school classmates. “They were isolated in the emergency department of our Houhu District hospital.”

Someone posted Li’s messages online. Soon afterward, local police reprimanded Li for spreading rumors and forced him to apologize. But their efforts to muzzle him backfired. Li eventually contracted the virus. On January 30, 2020, as his condition worsened, he posted publicly about his run-in with the authorities on the Twitter-like platform Weibo. What happened next reveals a great deal about the dynamics of state control and popular dissent on China’s internet.

The metaphor most often used by Western observers for the Chinese internet is a wall. The slew of controls enacted by the state to regulate internet traffic is the “Great Firewall,” and using a VPN or other tool to circumvent these controls is called pa qiang, or “climbing the wall.” But this metaphor tends to obscure what is happening on the other side of the barrier. There we find people who respond to state controls with creativity and spunk. While some spend their days trawling cat videos, others create oases of subversion within the reality that they’ve been dealt.

Facebook is quietly helping to set up a new pro-tech advocacy group to battle Washington

Tony Romm:

Facebook is working behind the scenes to help launch a new political advocacy group that would combat U.S. lawmakers and regulators trying to rein in the tech industry, escalating Silicon Valley’s war with Washington at a moment when government officials are threatening to break up large companies.

The organization is called American Edge, and it aims through a barrage of advertising and other political spending to convince policymakers that Silicon Valley is essential to the U.S. economy and the future of free speech, according to three people familiar with the matter as well as documents reviewed by The Washington Post. The people spoke on the condition of anonymity to describe the group because it hasn’t officially been announced.

In December, American Edge formed as a nonprofit organization, and last month, it registered an accompanying foundation, according to incorporation documents filed in Virginia. The setup essentially allows it to navigate a thicket of tax laws in such a way that it can raise money, and blitz the airwaves with ads, without the obligation of disclosing all of its donors. Many powerful political actors — including the National Rifle Association — similarly operate with the aid of “social welfare” groups.