Wednesday, May 28

Edward Snowden and the House of the Internet's toxic mold problem

"Mold needs a couple of things to grow. It needs water, it needs cellulose. Everything we build our homes out of, almost, is cellulose-based."
-- Attorney Alex Robertson, specialist in toxic mold cases, commenting in 2001 on the toxic mold in Erin Brockovich's house

Michael Hayden, a former NSA director, has complained vociferously that Edward Snowden didn't just steal copies of classified material as other leakers have done; he revealed what Hayden called "the plumbing" -- files that show how the NSA data collection system works, the NSA's methods.

Darn tootin' Snowden revealed the plumbing. That's what a good house inspector is supposed to do.  The NSA's methods depend for their success on vulnerabilities that were deliberately built into the Internet by its American creators; add to this the fact that, like cellulose, the Internet is highly permeable.  It's that last part that the builders of the House of the Internet didn't take into consideration. They were thinking in terms of trap doors.  As to what would happen when the doors, which were themselves highly permeable, got waterlogged from leaks....

What happened is that it's not only the 'good guys' who can exploit the vulnerabilities; it's also the toxic types.  And so the inbuilt vulnerabilities have cost companies and even the U.S. government untold billions of dollars in cybercrime, and put the entire world -- including the U.S. military -- under serious threat of cyberwar.

The vulnerabilities have also made privacy a joke for individual Internet users and made fools out of sophisticated cyber prophets such as Mark Pesce who envision the Internet replacing brick and mortar government.  But how could the prophets have known about the Internet's true vulnerabilities, until Snowden got into the basement of the House of the Internet and began inspecting the plumbing?

So it's not only the "Stasi State" that Snowden's saving us from, as Daniel Ellsberg wrote last year.  He's saving the entire modern era.  Any doubts? Read the book of the year and maybe the book of the decade: Cybersecurity and Cyberwar by two academics, P.W. (Peter Warren) Singer and Allan Friedman, based at the Brookings Institution.

The book, published this April by Oxford University Press for its "What Everyone Needs to Know" series, had to meet Oxford's high bar for the series: satisfy academia while being accessible to a reasonably literate general reader.

Singer and Friedman cleared the bar. But by the time they're finished explaining how this modern era in communications has shaped up, it's painfully clear that Snowden wasn't kidding when he said that he was still working for the NSA -- only they were the last people to know -- although I assume he meant the last people in the cybersecurity field.

The situation reminds me of what happened to Erin Brockovich. She bought the million-dollar house of her dreams with proceeds she famously won from exposing that a power company was knowingly contaminating residential groundwater with a toxic chemical. Her McMansion turned out to be so toxic it was unlivable.

But while Erin and her family kept getting sicker and sicker over the course of the year after they moved in, they didn't make a connection between this and the house.  It was a contractor she'd brought in to fix a leak, one who could walk and chew gum at the same time, who clued her in that behind the lovely walls of her lovely house was very toxic mold.

She refused to be run off by mold, but the cleanup price was $600,000.  And she sued everyone in sight -- the builder, the subcontractors and the former owner -- on the grounds that it was faulty construction that had caused the water leaks that led to the mold.  Yet given what she went through with the power company, one would think that she would have found a house inspector who left no stone unturned before she signed the papers.  Obviously, she didn't.

Just so, the world's Internet users, including the U.S. military, have paid a very high price for not closely inspecting a communications system that has more holes than Swiss cheese, every one of which can be readily turned against them.

Of course nobody planned it that way. It just happened, when within a decade hundreds of millions of individuals, every government entity from federal to municipal, and every major private corporation piled onto a system that had never been given a thorough inspection.

As for the military's use of the Internet -- it wasn't as if they didn't know about the vulnerabilities; it's just that no one could have projected how the vulnerabilities would combine in a system that wasn't built to hold up the entire modern era!

The consequences are turning out to be worse than anyone imagined. Just one consequence, from the Washington Post's illuminating 2013 profile of Keith Alexander:
He has been credited as a key supporter of the development of Stuxnet, the computer worm that infected Iran’s main uranium enrichment facility in 2009 and 2010 and is the most aggressive known use to date of offensive cyberweaponry. U.S. officials have never publicly acknowledged involvement in what has been described by experts as the first known, industrial-scale cyber attack on a sovereign nation, one that is estimated to have set back Iran’s uranium production by as much as a year.

Alexander also pushed hard for expanded authority to see into U.S. private sector networks to help defend them against foreign cyberattacks.

Quiet concerns also have been voiced by some of the private companies that would potentially benefit from government protection against cyberattack.

At a private meeting with financial industry officials a few years ago, Alexander spoke about the proliferation of computer malware aimed at siphoning data from networks, including those of banks. The meeting was described by a participant who spoke on the condition of anonymity because the discussion was off the record.

His proposed solution: Private companies should give the government access to their networks so it could screen out the harmful software. The NSA chief was offering to serve as an all-knowing virus-protection service, but at the cost, industry officials felt, of an unprecedented intrusion into the financial institutions’ databases.

The group of financial industry officials, sitting around a table at the Office of the Director of National Intelligence, were stunned, immediately grasping the privacy implications of what Alexander was politely but urgently suggesting. As a group, they demurred. [...]

Alexander hasn't given up, by the way.  In his farewell testimony in February to the Senate Armed Services Committee, he was still pushing to make the NSA the guardian of American financial security:
But while he appeared to soften his position on bulk domestic surveillance on Thursday, Alexander also implored Congress to pass legislation that would expand the authority of the NSA and its twin-sister military organization, Cyber Command, to protect private and business networks from online data theft and cyber attacks.

“We need to have a classified relationship” with major businesses to aid their ability to secure their data, Alexander said. That relationship is currently the responsibility of the Department of Homeland Security; Alexander said he would meet with the new DHS secretary, Jeh Johnson, over the next several weeks.

Do you see what's happened?  The military was still thinking of cyberwar as an air and ground offensive: you attack, then pull your tanks back and put your planes in the air to protect your flanks.

It doesn't work that way in a world in which there is no real ground and sky. 

So now Alexander and everyone else in U.S. government security is running around, trying to guard every gate. Can't be done in a highly permeable world with no real gates.

Short of practically razing the Internet to the ground and rebuilding it, just how are they going to secure it from devastating cyber attacks?  Companies here in the USA and the world over are now scrambling to devise security patches for the Internet. The cybersecurity industry has grown by leaps and bounds since Snowden revealed that the House of the Internet is full of holes and places toxic mold can grow -- places that are hard to detect unless you get into the basement and know what you're doing while you're mucking around in there.

That's also what the somewhat unfortunately named TrustyCon conference was about in February:
“TrustyCon” – short for the Trustworthy Technology Conference – came together in a hurry after Mikko Hypponen, chief research officer for F-Secure, a Finnish security company, announced in January, in a public letter to RSA, that he was canceling his scheduled RSA conference talk and that his own company would skip the event entirely.

Hypponen, a rock star in the computer security world, gave the opening keynote at TrustyCon instead. It was a pessimistic assessment of technology users’ chances of having computing and communications they can genuinely trust in an age when nation-states have taken over as the most dangerous – even malicious – hackers on Earth.

“Our worst fears turned out to be fairly accurate,” Hypponen said of what’s transpired in the security world over the past few years.

Hypponen's letter to RSA is linked in the report.  And read the entire report to understand why he pulled out of the RSA conference.

Governments are also scrambling.  The German government was so spooked by Snowden's revelations that it wanted to leave the Internet and build its own. He told them during an interview on German public TV that it wouldn't work. Then, like a good house inspector, he explained what it would take to be secure:
"The NSA goes where the data is. If the NSA can pull text messages out of telecommunications networks in China, they can probably manage to get Facebook messages out of Germany. The solution to that is not to set everything in a walled garden.  It’s much better to secure the data internationally rather than playing, ‘let’s move the data’. Moving the data isn’t fixing the problem, securing the data is the problem.”

All governments had better start listening to Snowden and doing what he advises; the first government that should listen is the U.S. one.  Snowden's enemies need to can the crap about his being a spy, or being a tool of the Russian government, or the FSB pulling secrets out of him, or whatever the latest accusation is.  Another fallback position is that he was just a hacker.  So then Snowden told NBC's Brian Williams that actually, no, he'd been a spy -- a trained U.S. government spy. The interview is being aired tonight at 10 PM in its entirety.

(Did I not remark that Putin pegged him as a former spy?  Recall my post about Putin boxing his ears on Russian national TV.)  Putin surely also knew that because Snowden was trained in spycraft, trying to pull secrets out of him would be asking to get sent for a ride on a Trojan Horse.

As to why Snowden came in from the cold -- in a way the news was already out, given what Putin said about him on Russian TV. But one guess is that he got tired of listening to people in Washington and London smear him for no other reason than CYA.

Yet I would hope that by now the U.S. military knows they have to get this man home and pick his brains about how to clean up the Internet's toxic mold problem.

Deutsche Welle summed it all up in their discussion of Snowden's appearance on German TV:  "Millions of people who never before glanced at the innards of the digital era now strive to learn its arcane terminologies so they can follow the juggernaut's progress."

Yes. Most of us are at the bottom of a high ladder of learning. The crux of the situation is that the holes in the Internet can be made into many juggernauts by all kinds of bad actors who do know the innards.  As the book by Singer and Friedman makes clear, it's going to take a lot of cooperation among governments to deal with the situation.

If nothing else, Snowden's revelations (the bulk of which were made after the book's publication) scared a lot of governments straight -- well, as straight as a government can get. Germany's Parliament has launched an enormous inquiry that will take at least two years to complete. They want to cover all the bases:
Where are the limits of international cooperation for German intelligence services? What are Americans allowed to do in Germany, and what not? Did German authorities know about the practices of the NSA? Who operates and secures the Internet hubs - the sites that were used to get information? The investigation committee has to tackle all of these questions.

All that and much more. They can't subpoena witnesses from foreign governments, but they're hoping to get cooperation from the U.S. and U.K. What the Bundestag has in mind, broadly, is to develop a kind of template that, going forward, their government and all others can use to help determine where the lines are when it comes to clandestine surveillance and cyberwar.

This kind of template should have been made 10 years ago, but now there's the impetus to build it.  The commission has already been turned into a political circus in Germany, although that's to be expected -- and of course Snowden got dragged into the uproar.  But as they slog forward, I hope their effort will attract greater international cooperation.

Yet cooperation is only one part. The other part is the need for brilliant minds that are very knowledgeable about the House of the Internet.  The April Vanity Fair profile of Snowden mentions that the U.S. military has vetted the account that an NSA employee provided to Forbes last year about Snowden's time at the NSA Hawaii branch. The employee stressed that Snowden is a genius, a rare type of genius.  We need to benefit from that rare genius, not try to jail it on a trumped-up espionage charge.

Singer and Friedman point out that in a key respect this era harks back to the pre-World War I era:  European governments had gotten control of a host of new technologies, such as the telegraph, which they sought to weaponize.  But the governments and their militaries didn't know how to control how the technologies would work and interact when deployed as weapons. This is because they couldn't imagine many of the outcomes, which had never been set in motion before.

There was no Edward Snowden, no rare genius with a highly integrated understanding of the new technologies, to warn, to explain how much could go wrong and how it could go wrong.  So the governments that led the fighting in World War I had to learn the hard way. It is nearly unbearable to contemplate in detail what the learning process entailed.


CNN and Fox Cable are marketing random mass killings as reality TV entertainment. Major TV advertisers such as Walmart, Target, Coca-Cola, Pepsi, McDonald's, Taco Bell, Comcast, Verizon, Microsoft, Ford, and Chrysler are allowing their brands to be associated with the practice

"Rampage killers cast themselves as stars of a public spectacle"
-- Ari N. Schulman, What Mass Killers Want -- And How To Stop Them

The U.S. news media's saturation coverage of random mass killings of Americans, and the way it's glorified and encouraged the killers, have developed incrementally over a period of years. So the problem kind of crept up, but now it's at crisis proportions.

And the major television advertisers are not only financing what could be called Mass Murder Reality TV; their iconic brand images are also suffering from constant association with saturation TV coverage of the mass killings. There are ten minutes of TV coverage of a killer's family life or graphic images of the murder scene, then a switch to a smiling person enjoying a Coca-Cola.

This is very effective rebranding of a product -- effective because it's subliminal.  Decades ago this kind of rebranding was a transient phenomenon because the random mass killings were few and far between, and got nowhere near the TV coverage that they do today.  There are now many such killings, occurring on an almost routine basis, and they receive 24/7 TV coverage, often for several days or even weeks on end. Then you turn around and there's another mass killing, and the whole cycle starts up again, frequently punctuated by commercial breaks.

The irony is that the major corporations that sell directly to the public have sunk a lot of money into research to ensure that their brands don't have the wrong associations in the consumer's mind.  They're very picky about what kind of TV shows they want associated with their brands. Yet they don't seem to have noticed that their brands are now associated with the worst horrors of modern American life.

I'm going to unpack the situation a little more before I make a suggestion to the corporations who advertise on the news stations.

A New Kind of Mass Killer

While often the massacres are still termed "rampage" killings, studies done by police task forces and forensic psychologists have found that in this era the perpetrators don't display the pathology of the classic rampage killer.

The random mass killer who's emerged during the last couple of decades is sane, and not suffering from limbic rage while carrying out the killings. He is methodical and highly organized in his planning and execution of the crime, which he generally plans to end with his own death, either by 'suicide by cop' or by his own hand.

He also plans the mass murders so they achieve maximum publicity. Often he's in competition with the publicity records of earlier mass killers, which he studies before planning his own crime.

He also attempts to create maximum suffering for the survivors of his massacre because he knows TV reportage will extensively cover this angle.

The killer's focus is on making the spotlight for his killings as large as possible. It is the publicity itself, how great he can make the publicity, that is the main driver of his actions, no matter what rationalizations he uses for the killings.

For more details on the forensic profile of this type of killer and his modus operandi, see Ari Schulman's November 8, 2013, report for the Wall Street Journal, which I've linked to above.  While Schulman doesn't specifically note this, I think it's obvious that the new kind of killer is very much a creature of television news.  Television is the primary news source for most Americans and the most powerful transmitter of images of violent crimes and the survivors' suffering.

So this is literally made-for-television crime.

A New Kind of Television News Reporting

In step with the rising epidemic of the new killer has been the transformation of the 24/7 cable news stations into reality television; specifically CNN and FNC (Fox News Channel). In little more than a decade these stations have gone from presenting coverage of U.S. national politics as if it were a gladiatorial contest to converting any kind of story that has a sensational angle into a reality TV show.

The reality show is repeatedly aired until the producers can latch onto another news incident -- a fire, an accident, a trial, anything -- and make another reality show out of it, which then gets saturation coverage for days or weeks on end.

In short, they've converted news into reality entertainment, and applied this type of programming to the coverage of random mass killings of Americans.

The entertainment isn't limited to the evening news hours on these stations; every single hour of programming continues the saturation coverage of a killing, which is then re-aired after prime time ends.

Yet such coverage in the wake of a mass killing is not only entertainment. It's also an advertisement for this type of massacre. As such, it goes hand in glove with the mass murderer's desire to have his death command a large audience. I suspect that in these cases the killings themselves are stage business, meant to draw the spotlight to the killer's death.

Such programming is also an infomercial on how mass killings are accomplished.  Prospective mass killers can and do study the infomercial for insights in their planning of a massacre.

And just as the new type of killer is in competition with other killers for TV ratings, MSNBC cable news and the commercial TV networks -- ABC, CBS, NBC -- have increasingly aped CNN and FNC's reality TV programming approach to compete with their coverage. 

The New Killer, Reality News TV, and Terrorism

Al Qaeda and its terrorism franchises also study the infomercials for tips and note the huge amount of publicity the crimes are given on TV news shows.  In fact, random mass killings executed by a 'lone wolf' type are exactly what AQ chief Ayman al-Zawahiri has called for at this time as the most effective strategy for terrorists who want to kill Americans.

Of course, the random mass killings aren't characterized as terrorism. That's because terrorist acts and the killings have different root causes. Yet both have the same devastating effects.  A review of TV footage of reporting on a terrorist attack and a mass killing shows that despite their different causes, both types of incidents have the same effect on the victims, the survivors, the relatives of both, and the general populace.  There is no difference in the reactions.

Nor is there a difference in the way television news handles the two types of incidents -- reporters ask the same kinds of questions and treat coverage of the two types of perpetrators in the same way.

Thus, a blurring of mass crime and terrorism, mixed with a blurring of shock reality TV and news.  This is what America's major television advertisers are financing when they buy time on U.S. cable and commercial TV news shows.  And this is what they've associated their products with: people who've learned to stretch their 50 seconds of fame on a TV news report into 15 days by carrying out the most gruesome mass crimes they can think up, in effect turning U.S. TV news stations into their publicity agents!


Ari Schulman tells of a rash of copycat suicides that broke out in Vienna, Austria.  People were jumping onto subway tracks into the path of an oncoming train.  There were more and more jumpers, despite all police attempts to halt the suicides. Finally the authorities decided on an experiment.  With cooperation from the local media, the suicides were given the barest minimum of publicity, and the coverage was shorn of all lurid details and 'human interest' accounts of the jumper's grief-stricken family, etc.

The subway jumping suicides plummeted by 75 percent.

It was like smothering a fire. The circumscribed publicity took the oxygen out of grandstanding suicide. Given that most of the random killings happening in the USA are carried out simply to draw large attention to what is in effect the killer's suicide, the same approach used by the Vienna authorities could be applied to cutting down the killings.

Schulman's report lists methods to strip reportage on a mass killing of its sensational details and greatly shrink its media spotlight, limiting how long the incident stays in view.  The problem with applying these tactics to American cable news TV is that the producers working at these stations now only seem to understand how reality TV works.  They don't seem to know what it means to be a news station.

But on paper, at least, I suppose they could be persuaded to greatly limit their coverage of the mass killings. Yet the data that Ari Schulman brings forward is clearly well known to law enforcement officials.  So a question would be whether the officials have already tried and failed to get the TV stations to stop glorifying the random killings.

Would pressure from the big advertisers help?  I think it would be worth a try.

The only other solution I can see is the passage of time.  As the mass killings continue to escalate, I'd guess it's only a matter of time before many in the public turn their blame on the TV coverage, and from there, the advertisers.  Then I'd guess millions of people would tune out the news stations and stop buying products advertised on them. Not a good prospect, but perhaps only by such means will major American companies cease to support a very deadly practice.


Tuesday, May 27

Self-Government: A closer look at Adolf Gasser's points and a brief return to Arvind Kejriwal's "Swaraj"

"An increasingly sophisticated and highly specialised society and economy can only function if decisions are taken locally and the process is decentralised ... This speaks in favour of politically non-central solutions and underscores the growing importance of communal and personal autonomy."
-- From Robert Nef's analysis of Adolf Gasser's major work

In Part Three of The Devil and Departmentalization (Beat the Devil) I looked briefly at Robert Nef's discussion of Adolf Gasser's ideas about municipal autonomy and found they came up short, when applied to unrestricted departmentalization in government agencies:
Gasser was analyzing the effects of centralized government on the patchwork of countries on the European continent, all of which combined could fit into the continental United States 2-1/2 times. So while municipal autonomy can ward off the worst effects of departmentalization gone berserk in a small country, what happens when a municipality is as large as one or more European countries?

But while I provided the link to Nef's paper and quoted Wikipedia's summary of Gasser's work, my very circumscribed area of criticism shortchanged Gasser's points as they apply to defending freedom; specifically, to preventing a democratic republic from lapsing into authoritarian rule.  Yet the points hold as true today as during World War II, when Gasser surveyed the wreck of Europe's experiment in democracy.

The problem is that Gasser's magnum opus has never been translated into English (to my knowledge).  So here I again rely on Robert Nef's translation for a closer look at Gasser's major points.

With regard to the term "federalism," elsewhere in his paper Nef drily observes that an American professor collects definitions of the term as a hobby; he'd collected 495. But Nef's paper is a good grounding in the term.

Two caveats before I turn to Nef's analysis.  He writes, "Countries that have inherited systems of local autonomy dating back to times immemorial have effectively resisted both monarchic and bureaucratic centralisation in the form of absolutism as well as left- and right-wing totalitarianism."

I venture that Gasser and Nef confined their observations to Western countries.  The Devil and Departmentalization series was inspired by my reading of Arvind Kejriwal's 2011 Swaraj ("Self Rule"); specifically, his discussion of how the British Raj used a process of departmentalization to co-opt a system of self-government in India's villages that had existed unbroken for thousands of years.

The system did survive, in a sort of Potemkin form, after the British left.  As Kejriwal noted, after Independence the central government simply switched out the British Sahib for the Indian Sahib.  And so India's central government retains control over the country's local governments.

The point is that it takes more than a good system of local government to protect it from being co-opted.  A carefully written Constitution, while not a guarantee, as Nef points out, is a prerequisite, as is a robust justice system.

But as I stressed in Beat the Devil a key factor in maintaining good local government may well be keeping it nonprofessional; i.e., a volunteer effort. This on the theory that when people have to volunteer their time to making government work, they tend to come up with solutions rather than making a career out of problems.

The other caveat is that Nef's analysis necessarily focuses on the mechanics of governing.  Yet prior to good government or undergirding it is self-sufficiency; without this, even the best decentralized governing mechanics can be co-opted by a central authority. Gasser was very clearly aware of this point. ("Municipalities would have to be in a position to secure for themselves adequate sources of revenue.")

I'll discuss the issue of self-sufficiency in upcoming posts.    

From In Praise of Non-Centralism; Section 9, Federalism and Municipal Autonomy; pp. 71-75; 2003; Robert Nef. (Nef's entire paper is available for free, in English, in PDF):

Gasser’s central concern is to illustrate the interdependence of democracy and municipal autonomy as a precondition for a permanently stable state.

During the second world war, the Swiss historian, Adolf Gasser, wrote Gemeindefreiheit als Rettung Europas ["Communal Freedom as the Salvation of Europe;" first ed. 1943], a well-reasoned and pioneering [discourse] on municipal autonomy as the saviour of Europe.

With intellectual acumen and brilliant language, Gasser develops the main thrust of his thesis: countries with democratic constitutions can only be viable if they have federalist structures and the municipalities have extensive, legally guaranteed autonomy.

According to Gasser, “internal and not external policy was responsible for the collapse of libertarian state constitutions. Democracy failed in all countries with a tradition of political freedom because, as liberty and order could not be combined into an organic whole, it was only obvious that opposing social and political forces would take over and hamper the successful development of democratic institutions” (Gemeindefreiheit, p. 8).

By liberty and order, Gasser means a socio-political constitution that is based upon and builds on municipal autonomy. Municipal freedom means the free social cooperation and classification of the individual. The will to be involved and take on responsibility within a small sphere is crucial. Countries that have inherited systems of local autonomy dating back to times immemorial have effectively resisted both monarchic and bureaucratic centralisation in the form of absolutism as well as left- and right-wing totalitarianism. In Gasser’s opinion these ‘old free’ states include Great Britain, the USA, the north European countries, the Netherlands and Switzerland.

At the other end of the spectrum are the ‘liberalised authoritarian countries’ of continental Europe like Spain, France, Italy, and Germany. “The large mainland states have completely absorbed the principle of administrative command and subordination and are therefore imbued with the spirit of power […] Consequently, the modern state in Italy, Spain, Portugal, France, Germany, and Austria was also built unilaterally, from the top down. Thus individual classes of society were mechanically fused together by an administrative command and power apparatus to form a national unit, leaving the people with no opportunity in their local sphere to work together and shoulder collective responsibility for the prosperity of the state, while learning to trust each other politically” (Ibid, p. 103).

“Interpreting what constitutes the ‘state’ is very different in a world with municipal freedom and in a world without it. In the former, state order is based on the general desire for local self-administration while in the latter, it is based on general subjugation to the bureaucratic apparatus."

In Gasser’s opinion, it is, therefore, a “fundamental mistake to somehow try and compare the social and political differences of the authoritarian, centralist world with those of the communal, federative world. When situations within the hierarchy of officialdom threaten to degenerate into passion and hatred, even where there are liberal constitutions, the moderating effect of moral counter forces will prevail in conditions of wide-ranging local autonomy” (Ibid, p. 181).

“Living together in freedom is viable only if an organisation has a clear and transparent structure, people know each other personally and usually judge others and members of their self-elected local governments not only by the party they belong to, but by their skills and even more by character. Such a vibrant ‘citizens’ school’ where different opinions and special interests are in constant competition to ensure that a sensible balance is achieved, can be realised only if there is a free municipal self-government” (Ibid, p. 166 and onwards).

Gasser's vision of a new Europe after 1945 carries greater weight today than ever before. “Europe can only become a world of true universal democracy, if, at the same time, it becomes a world of communalism and vibrant self-government, if steps are taken to liberate centralist countries from bureaucratic hierarchy and from the administrative principle of command and subordination, and to rebuild from the bottom up.”

To carry out this process successfully, Gasser proposes giving “prompt and strict instructions to the district bureaucracies to refrain from handling certain matters that come under the purview of the municipal administration. The partial autonomy secured in this manner will then gradually develop into a ‘pouvoir communal’ by allocating more responsibilities elsewhere, and communal power should be legally safeguarded against intervention.

Municipalities would have to be in a position to secure for themselves adequate sources of revenue, and be given full responsibility for determining their own budget with all the self-discipline that this entails. Without being responsible for their own finances, neither the desire for living self-government nor communal social ethics will be able to flourish” (Ibid, p. 199).

Gasser however admits that there would be several hurdles impeding a “strategy of ordered withdrawal” and these would be difficult to overcome.

“In places where people have always been accustomed to an administration based on centralist bureaucratic hierarchy, demands for greater municipal autonomy are not particularly popular […] Re-education should be considered only if a strong and stable government acknowledges the necessity, tackles the process methodically and gives step-by-step instruction” (Ibid, p. 204).

It remains to be seen whether this is a realistic method or whether change can only be induced through pressure ‘from the bottom up’.

“The rejection of the authoritarian state and the principle of administrative command and subordination that underlies all genuine communalisation ultimately require a new interpretation of the law. This means that the state should no longer be the source of all legislation; we must perceive the constituent parts of the state as upholders of their own independent laws as was the case in early and medieval European law: individuals, families and communities first, followed by districts and provinces. On no account should one be content with comprehensive federalism unless there is comprehensive and legally secure municipal autonomy” (Ibid, p. 205).

Are municipal autonomy and democracy still suitable in a service- and information-based society characterised by mobility, complex labour division, and interdependence? Are small, democratic social units and communal autonomy not rooted in static, rural-agricultural, and small-town societies that are virtually nonexistent today?

Gasser denied this and singled out municipal autonomy as the balancing element in a social and welfare-state political system. He believed that only in conditions that people understand and are true-to-life can they acquire “what is described as political intuition and a sense for human proportions […]; only here, on the ground of freedom, does a modicum of belief in the community develop that can effectively curb the tendency towards authoritarianism and anarchy […]” (Gemeindefreiheit und Zukunft Europas, p. 463).

An increasingly sophisticated and highly specialised society and economy can only function if decisions are taken locally and the process is decentralised; in other words, people must take the initiative and be prepared to assume responsibility themselves. This speaks in favour of politically non-central solutions and underscores the growing importance of communal and personal autonomy.


I don't know which is scarier: global warming or the mad scramble to halt warming

"Subsiding land is a bigger immediate problem for the world's coastal cities than sea level rise."

I see that some of the formatting for the Gondolas of Wall Street post is messed up and that I forgot to add a link to the USA TODAY report, Rising sea levels torment Norfolk, Va. and coastal U.S. I'll fix the formatting when I can make the time. Here I want to clarify that the "Be Here Now" approach I favor to dealing with climate change, and which I think the ten-part 2013 USA TODAY series (Weathering the Change) I mentioned in Gondolas also favors, doesn't mean rejecting out of hand the best guesses of scientists about the drivers of climate change.
A reader who responded to the Gondolas post stated flatly that CO2 emissions aren't a driver of global warming and pointed me to where I could learn about the real drivers. (See the comment section at the post for his full comments.) It's just this kind of flat statement, found on both sides of the climate change debate, that puts my back up.

But I wonder how many would care all that much about the debate if it weren't for the fact that scientists aren't just making pronouncements; governments are taking action based on the pronouncements.
This is bringing every mad inventor on the planet out of the woodwork.

The debate about whether global warming was caused by human activity picked up steam in the year 2000. That was when NASA put up satellites that began beaming back to Earth huge amounts of data that scientists and mathematicians have been trying to fit into a coherent picture about Earth's climate.  The most objective among these people are frank in saying that interpreting the data is more art than science.

But in 2013 the USA TODAY editorial board -- USA TODAY being a middle-of-the-road newspaper -- finally took a firm stand. The Board accepted the current majority opinion among climate scientists, which is that large amounts of man-made CO2 emissions are connected with global warming, even though the emissions might not be the only driver of the phenomenon.

Of course the majority opinion could be wrong. But the newspaper's series on climate change takes the practical stand that this is where we are now. Yet given some of the counterproductive attempts to deal with global warming that government has already backed (e.g., ethanol and carbon swaps), intelligent behavior going forward means not stampeding ourselves into the kind of situation that allowed the NSA to turn our fears about terrorist attacks into a nightmare about surveillance.
We risk going down the same road with our attempts to stop global warming.  One article in the USA TODAY climate series outlines various technological fixes being proposed to halt global warming.  Some of the solutions sound wonderfully ingenious; some are nothing short of horrifying.  Horror would be a big price to pay for trying to solve a problem that might not even be correctly identified at this time by science.

So I think the wise course, at this point, is the Be Here Now one. An example of this approach: We aren't absolutely certain about what's causing the sea level to rise. Yet it is absolutely certain that rising seas are only one half of the problem for many large coastal cities. The other half is that the land on which those cities rest is sinking.

How much are the cities sinking, how fast are they sinking, and why are they sinking? We need to find that out 'yesterday,' on a case-by-case basis. This is because there are proven technological solutions for the sinkage problem, the man-made aspect of the problem. But if the sinkage isn't addressed, then when you combine it with even a small rise in sea level, the big U.S. coastal cities, especially on the Atlantic coast, are staring down the barrel of disaster with every major hurricane.
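To see why sinkage dominates the arithmetic, consider a back-of-envelope calculation. The rates below are purely illustrative assumptions, not measurements for any real city; the point is only that the relative rise a city experiences is the sum of the two effects:

```python
# Back-of-envelope: the relative sea-level change a coastal city experiences
# is the sum of local land subsidence and absolute sea-level rise.
# All rates below are illustrative assumptions, in millimeters per year.

subsidence_rate_mm = 8.0   # hypothetical sinkage from groundwater extraction
sea_level_rise_mm = 3.0    # hypothetical absolute sea-level rise

relative_rise_mm = subsidence_rate_mm + sea_level_rise_mm

# How long until the city sits half a meter lower relative to the sea?
threshold_mm = 500.0
years = threshold_mm / relative_rise_mm
print(f"Relative rise: {relative_rise_mm} mm/yr -> {years:.0f} years to 0.5 m")
```

With these made-up numbers the land's contribution is more than double the sea's, which is the BBC article's point further down: in some places the ground is going down far faster than the water is coming up.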

This isn't even talking about what the major West Coast cities could face if they're sinking and an offshore earthquake sends a major tsunami their way.

I add that the sinkage problem is one climate change-related topic that USA TODAY's series didn't address, perhaps because it's not a climate change issue even though it intersects with it. And the research team that prepared the series in 2013 might not have known about the issue. It wasn't until the latter part of last year that the scope of the problem got much publicity -- although I recall that some months after Superstorm Sandy wreaked havoc on the New Jersey coast, John Batchelor interviewed a reporter about the sinkage problem there. And as far back as 2007, London's sinkage problem had made the news, although not in the U.S.

To sum it up, the Be Here Now approach is to hedge our bets: do what we can now to address long-standing problems connected with coastal flooding that threaten lives and property, no matter what's causing the rise in sea levels.

The same approach can be applied to drought. The silvopasturing method of cattle raising, which I featured in an earlier post, conserves tremendous amounts of water. That method can be implemented even more cheaply than was discussed in the article I featured, if cultivating valuable timber isn't part of the pasturing.

Can this method, long in use in the American Southeast, be transferred to the Southwest? I think the idea is worth considering, and there are plenty of other ways to conserve water.

One Pundita reader commented that farmers in New Mexico could relocate to places in the USA such as Ohio, where water is cheap and plentiful. That might be an attractive solution for some. However, many of the farmers have roots in the New Mexico-Arizona region that go back many centuries.

Yet if they want to stay, it will take more than tenacity. They will have to forego tradition and learn to take advantage of every method now known for conserving water.  The megadrought they're living through might have no connection with human activity; it could be cyclic and if it's part of a classic megadrought cycle, it could last for decades more. 

If you read the article by Ari LeVaux that I featured, you were hit with the stunning news that as of 2012, at least, farmers in drought-stricken regions of New Mexico were still using the flood method of irrigating their crops. This water-wasting method includes a canal system inherited from the Spaniards, who got it from the Moors. Try to imagine how many gallons of water evaporate off those canals in the blazing heat and bone-dry air! 

Ari mentioned that one of the farmers had decided it was time to convert to the drip tape method of irrigation. Ya think? 

In short, there's plenty of low-hanging fruit without resorting to solutions that include "blasting sulfate aerosols into the stratosphere to reflect sunlight away from Earth." I am NOT making that up; see the USA TODAY article on technological fixes I linked to above.

I will leave you with the BBC's 29 April 2014 article on sinkage, or 'subsidence.'  See the Beeb's website for source links in the report and an eye-popping graphic:

Megacities contend with sinking land
By Jonathan Amos, Science Correspondent
BBC News, Vienna

Subsiding land is a bigger immediate problem for the world's coastal cities than sea level rise, say scientists.
In some parts of the globe, the ground is going down 10 times faster than the water is rising, with the causes very often being driven by human activity.

Decades of ground water extraction saw Tokyo descend two metres before the practice was stopped.
Speaking at the European Geosciences Union General Assembly, researchers said other cities must follow suit.

Gilles Erkens from the Deltares Research Institute, in Utrecht, in the Netherlands, said parts of Jakarta, Ho Chi Minh City, Bangkok and numerous other coastal urban settlements would sink below sea level unless action was taken.

His group's assessment of those cities found them to be in various stages of dealing with their problems, but also identified best practice that could be shared.

"Land subsidence and sea level rise are both happening, and they are both contributing to the same problem - larger and longer floods, and bigger inundation depth of floods," Dr Erkens told BBC News.

"The most rigorous solution and the best one is to stop pumping groundwater for drinking water, but then of course you need a new source of drinking water for these cities. But Tokyo did that and subsidence more or less stopped, and in Venice, too, they have done that."

The famous City of Water in north-east Italy experienced major subsidence in the last century due to the constant extraction of water from below ground.

When that was halted, subsequent studies in the 2000s suggested the major decline had been arrested.
Pietro Teatini's research indicates that significant instances of descent were now restricted to particular locations, and practices: "When some people restore their buildings, for example, they load them, and they can go down significantly by up to 5mm in a year." How far they descended would depend on the type and compaction of soils underneath those buildings, the University of Padova researcher added.

Like all cities, Venice has to deal with natural subsidence as well.

Large-scale geological processes are pushing the ground on which the city sits down and under Italy's Apennine Mountains. This of itself probably accounts for a subsidence of about 1mm each year. But on the whole, human-driven change has a greater magnitude than natural subsidence.

Scientists now have a very powerful tool to assess these issues. It is called Interferometric Synthetic Aperture Radar. By overlaying repeat satellite images of a specific location, it is possible to discern millimetric deformation of the ground.

Archives of this imagery extend back into the 1990s, allowing long time-series of change to be assessed.
The European Space Agency has just launched the Sentinel-1a radar satellite, which is expected to be a boon to this type of study.
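A footnote on the mechanics the BBC article describes: InSAR detects ground motion by comparing the radar phase between repeat satellite passes; because the radar signal travels to the ground and back, a phase shift corresponds to line-of-sight displacement of (wavelength / 4π) per radian. A minimal sketch of that conversion; the wavelength is the approximate C-band value used by radar satellites like Sentinel-1, and the phase numbers are invented for illustration:

```python
import math

# InSAR converts an interferometric phase difference into line-of-sight
# ground displacement: d = (wavelength / (4 * pi)) * delta_phi.
# The factor is 4*pi rather than 2*pi because the radar path is two-way.

C_BAND_WAVELENGTH_M = 0.0556  # ~5.6 cm, approximate Sentinel-1 C-band value

def los_displacement_mm(delta_phi_rad: float) -> float:
    """Line-of-sight displacement in millimeters for a phase difference."""
    return (C_BAND_WAVELENGTH_M / (4 * math.pi)) * delta_phi_rad * 1000.0

# An invented half-radian phase shift between two satellite passes:
print(f"{los_displacement_mm(0.5):.2f} mm")  # millimeter-scale sensitivity
```

This is why the article can speak of discerning "millimetric deformation": even a small fraction of a phase cycle maps to a few millimeters of ground motion.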



Saturday, May 24

The Gondolas of Wall Street

"We don't know how much sea levels will rise or fall," says Myron Ebell, director of the Center for Energy and Environment at the Competitive Enterprise Institute, standing knee-deep in flood water during his interview with a USA TODAY reporter.

Okay, I made up that last part. I couldn't resist. Ebell's outfit is "a libertarian research group funded partly by fossil fuel interests. He says models can't reliably predict the climate, because its changes are 'nonlinear' or irregular, so flood walls and other measures might waste money."

For crying out loud, even the Department of Defense acknowledges there's a problem with rising sea waters:
The Army Corps of Engineers did a three-year case study, released [in November 2013], that found Naval Station Norfolk's vital infrastructure won't survive the powerful storms and flooding expected in the latter half of this century. In another report [in 2013], DOD said about 10% of its coastal facilities are at or near sea level and are "already vulnerable to flooding and inundation."
Retired Navy captain Joe Bouchard, who commanded Norfolk's naval base between 2000 and 2003, says he expects DOD's analyses will help it decide which bases to save — and how. He wishes he had such information when he replaced two of the base's single-deck piers with double-deckers.
"We got it wrong," he says, noting the new piers aren't high enough to withstand more than a foot of future sea-level rise. "We weren't thinking about climate change, period."
The landlubbers who live far inland can read the rest of USA TODAY's December 2013 report, Rising sea levels torment Norfolk, Va., and coastal U.S., to understand that flooding from rising sea levels isn't a future scenario for America's big cities, the ones located on a coast, which means most of them. It's happening now. The streets of Miami now flood at every lunar high tide. As for the 400-year-old city of Norfolk, it's turning into marshland.

How much of this is due to the burning of fossil fuels, to "man-made" global warming, to a cyclic pattern in the Earth's life, to coastal land sinking under the weight of mega-cities built on its edges, or to all of the above? Once you're taking incoming fire, it's not really the appropriate time to wonder exactly how you ended up in such a stupid situation.

We're here now and we have to do something about it. That's the message of the ten-part series on climate change that USA TODAY fielded last year: "The seas have risen and fallen before. What's new is the enormity of coastal development that will need to be protected, moved or abandoned." The series, a tour de force by reporter Wendy Koch, struck the right note, in my view.

I don't want to hear I'm a Climate Change Denier in the manner of a Holocaust Denier.  I am not alone.  Many people are still skeptical of the scientific research and mathematical models connected with climate issues. And few things are as annoying as watching opposing camps of scientists hurl data sets at each other and call each other crazy. The USA TODAY series speaks to people like me.
And I suspect that the most intelligent among the environmentalists are distancing themselves from the more strident Man-made Global Warming activists, on the theory that screeching at people for their sins doesn't fly well outside a Revival meeting. Even governments are figuring this out.  The USA TODAY article notes that when the Virginia Legislature ordered a study of the flooding issue it avoided using divisive terms such as "climate change" and even "sea-level rise."  Instead, the study was called a "recurrent flooding" analysis.

But as the article makes clear, this is not the time to be playing ostrich. The observation goes double for Wall Street denizens who like the way Myron Ebell puts things.  We'll see how nonlinear they like it when they're taking a gondola to work and snorkeling to a lunch date.


Are we ready to dispense with brick and mortar government?

"[T]he ‘distributed network’ format, expressed in the specific manner of peer to peer relations, is a new form of political organising and subjectivity, and an alternative for the current political/economic order, i.e., I believe that peer to peer allows for ‘permission-less’ self-organisation to create common value, in a way that is more productive than both the state and private for-profit alternatives."
-- Michel Bauwens - P2P Foundation

Cyber prophets such as VRML developer Mark Pesce say that clear signs of virtual government are already here: "People connected [on the Internet] in their numbers simply overwhelm, outperform, and thrust aside all obstacles ... that you can put in their way; and this is where we are right now."

To support his observation Pesce pointed to the way that citizens from all around Japan 'self-organized' to take the load off their government's attempts to monitor the radiation spill from the flooded Fukushima Daiichi nuclear reactor. The government didn't have enough workers to monitor with exactitude where the leaking radiation was traveling. So volunteers used personal Geiger counters to monitor radiation levels in their neighborhoods, then entered the data at a website, creating a countrywide map of the radiation levels.

This Internet-based, self-organizing group effort to deal with a specific task or issue -- also called crowdsourcing, adhocracy, smart mobs, peer-to-peer networking, and swarming -- is still in its infancy, but more and more citizens are finding ways to supplement government efforts by using the Internet as a kind of virtual government office complex, if you will.
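The mechanics behind such crowdsourced maps are simple at heart: many independent reports keyed by location, averaged into a shared picture. A toy sketch of the aggregation step; the place names and readings are invented, and real systems like the Fukushima radiation map of course add timestamps, validation, and geographic gridding:

```python
from collections import defaultdict

# Toy crowdsourced monitoring: volunteers submit (location, reading) pairs;
# the site averages the readings per location to build a shared map.
# All data below is invented for illustration.

reports = [
    ("district-a", 0.12),
    ("district-a", 0.14),
    ("district-b", 0.31),
    ("district-b", 0.29),
    ("district-b", 0.30),
]

readings_by_location = defaultdict(list)
for location, reading in reports:
    readings_by_location[location].append(reading)

# Average per location, rounded for a stable display value.
radiation_map = {
    loc: round(sum(vals) / len(vals), 3)
    for loc, vals in readings_by_location.items()
}
print(radiation_map)
```

The value of the crowd here is redundancy: no single volunteer's meter has to be trusted, because many overlapping reports converge on a picture no government field team could assemble as quickly.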

And as my Take a Memo post emphasized in humorous fashion, much of what we call government is just record keeping -- records that no longer need to be physically sited because they can be stored virtually.  Even at this early stage of cloud technology, a lot of government real estate is just taking up space because it's no longer needed for record storage.

Yet when you consider the scope of cybersecurity problems that Edward Snowden's revelations have turned up, we are still a long way from being able to dispense with brick and mortar government and the large bureaucracies that go with it. Indeed, the optimistic observations from Mark Pesce were made years before the NSA scandal broke. (1)

But with this caution in mind, Pesce was right.  Do It Yourself government is already on humanity's horizon; the following report is an amazing illustration of this trend.  Note that the self-testing kit is just one part of the story. The other part is that by entering data from the kit results on a website, citizens will be taking on an important function of America's national public health agency, the Centers for Disease Control. So far, the CDC seems quite happy about this. We'll see how happy they are when the trend accelerates, but for now it's all smiles and applause.

(See the Common Health website for the links in the article text): 
January 10, 2014
Common Health Organization
Rapid home flu test distributed by GoViral

You’re aching, you’re shivering, you’re coughing. You’re definitely, miserably sick, but is this real, potentially serious flu or just some garden-variety winter crud?

Better find out. You pull your handy-dandy virus test kit from the shelf, insert the nasal swab gently into your nostril and twist it around three times to coat it with your (copious) mucus. You swish the swab in liquid and deposit drops of your germy mix on the four wells of the instant test. Ten minutes later — voila. Sure enough, you test positive for an influenza type A. You call your doctor to ask about anti-viral meds, and — as a good citizen of your disease-tracking community – you go online to report your diagnosis to Flu Near You. On its map, you see that you’re not alone: a dozen of your neighbors have the same bug.

Futuristic? Not if you live in the Boston area and are part of a new flu-tracking experiment funded by the National Science Foundation, called GoViral. Run by researchers at Boston Children’s Hospital, the three-year project is just getting underway now, as this year’s flu season takes on steam.

Flu is more than a nuisance. It’s a serious threat — infecting tens of millions of Americans a year and killing an average of 24,000 — and public health types try hard to track and understand it. The CDC monitors reports from doctors’ offices, including lab test results. Google Flu Trends watches online searches for telltale symptoms. Flu Near You, where GoViral is based, already brings together thousands of volunteer sentinels who report online when they have symptoms.

Now, GoViral will take testing into the home, where many flu patients hole up rather than seeing the doctor.
“It’s never been done before, to give a lot of people in their homes these tests,” said Dr. Rumi Chunara, GoViral’s lead researcher. “This is the first time that we’re actually crowdsourcing diagnostic samples from people.”

The project breaks new ground in flu tracking, said Dr. Lyn Finelli, who leads flu surveillance and response at the National Center for Immunization and Respiratory Diseases at the CDC: “This is the first time that I know of that anybody has used what we call participatory surveillance,” she said, “where people indicate whether they’re well or ill, and participate in home testing and send the tests in. This is a very novel look at a surveillance system and home testing.”

Dr. Chunara plans to distribute several hundred free flu test kits to Boston-area members of the public who sign up (here) this winter, and expand to encompass more areas.

1) Last year I took the quotes from Pesce's Wikipedia page, but the updated version of the page omits them. 


Thursday, May 22

With apologies to Dorothy Parker: Tonstant NSA Soap Opera Watcher Pundita Frowed Up

In the last Pundita post I mentioned a former U.S. Attorney General's use of metaphor in the attempt to talk me into abandoning reason and the evidence of my eyes.  In what I am beginning to fear is a grand tradition, the present Attorney General, Eric Holder, and his Department of Justice have redefined the meaning of lying.

At the Guardian on May 17 Trevor Timm launched into a summary of the DOJ's attempt to play with people's heads by chirping at the reader, "If you blinked this week, you might have missed the news ..."
Yes, actually I had missed the news despite my best efforts to keep up with every episode of the NSA soap opera, and I wish I'd continued to miss it.  But by now I know how it is with the people over at the Guardian: if there's really bad news connected with the NSA surveillance story they're going to make sure the public doesn't miss it.

The news I'd missed is that two U.S. Senators have accused the DOJ of lying last year to the Supreme Court about NSA warrantless surveillance.  Trevor helpfully adds, "and those falsehoods all but ensured that mass spying on Americans would continue."

Unfazed by the Senators' accusation the Justice Department explained that what looked like lies actually wasn't.  And as a loyal American who supports my government catching every terrorist scheming to attack the USA, I'm supposed to swallow this treacle without throwing it up.

To pick up the story at the Guardian (see the website for numerous source links in the text): 
Here's what happened: just before Edward Snowden became a household name, the ACLU argued before the supreme court that the FISA Amendments Act – one of the two main laws used by the NSA to conduct mass surveillance – was unconstitutional.

In a sharply divided opinion, the Supreme Court ruled, 5-4, that the case should be dismissed because the plaintiffs didn't have "standing" – in other words, that the ACLU couldn't prove with near-certainty that their clients, which included journalists and human rights advocates, were targets of surveillance, so they couldn't challenge the law.

As the New York Times noted this week, the Court relied on two claims by the Justice Department to support their ruling:

1) that the NSA would only get the content of Americans' communications without a warrant when they are targeting a foreigner abroad for surveillance, and

2) that the Justice Department would notify criminal defendants who have been spied on under the Fisa Amendments Act, so there exists some way to challenge the law in court.

It turns out that neither of those statements is true – but it took Snowden's historic whistleblowing to prove it.

One of the most explosive Snowden revelations exposed a then-secret technique known as "about" surveillance. As the New York Times first reported, the NSA "is searching the contents of vast amounts of Americans' e-mail and text communications into and out of the country, hunting for people who mention information about foreigners under surveillance."

In other words, the NSA doesn't just target a contact overseas – it sweeps up everyone's international communications into a dragnet and searches them for keywords.

The Snowden leaks also pushed the Justice Department to admit – contrary to what it told the court – that the government hadn't been notifying any defendants they were being charged based on NSA surveillance, making it actually impossible for anyone to prove they had standing to challenge the FISA Amendments Act as unconstitutional.

It's unclear how much Solicitor General Donald Verrilli knew when he told the government's lies – twice – to the justices of the Supreme Court. Reports suggest that he was livid when he found out that his national security staff at the Justice Department misled him about whether they were notifying defendants in criminal trials of surveillance.

And we don't know if he knew about the "about" surveillance that might well have given the ACLU standing in the case. But we do know other Justice Department officials knew about both things, and they have let both lies stand without correcting the record.

Lawyers before the Supreme Court are under an ethical obligation to correct the record if they make false statements to the Court – even if they are unintentional – yet the Justice Department has so far refused. As ACLU deputy legal director Jameel Jaffer explained, the Justice Department has corrected the record in other cases where it was much less clear-cut whether it had misled the court.

The government's response, instead, has been to explain why it doesn't think these statements are lies. In a letter to Senators Ron Wyden and Mark Udall that only surfaced this week, the government made the incredible argument that the "about" surveillance was classified at the time of the case, so it was under no obligation to tell the Supreme Court about it.

And the Justice Department completely sidestepped the question of whether it lied about notifying defendants, basically by saying that it started to do so after the case, and so this was somehow no longer an issue.

But there's another reason the government wanted any challenge to the FISA Amendments Act dismissed without being forced to argue that it doesn't violate the Fourth Amendment: it has an extremely controversial view about your (lack of) privacy rights, and probably doesn't want anyone to know. As Jaffer wrote here at the Guardian earlier this week, the government has since been forced to defend the FISA Amendments Act, and it's pretty shocking how they've done it. Here's what the government said in a recent legal brief:

"The privacy rights of US persons in international communications are significantly diminished, if not completely eliminated, when those communications have been transmitted to or obtained from non-US persons located outside the United States."

This is an incredibly radical view of the right to privacy. We already know the government does not think you have any right to privacy when it comes to who you talk to, or when, or for how long, or where you are while you're talking.

Now the government has said, in court, that you don't have any right to the content of private conversations with anyone who is located outside the United States – or to any domestic communication remaining private if it is, at some point, transmitted overseas, which happens often. Jaffer explained the consequences of this view:

"If the government is right, nothing in the Constitution bars the NSA from monitoring a phone call between a journalist in New York City and his source in London. For that matter, nothing bars the NSA from monitoring every call and email between Americans in the United States and their non-American friends, relatives, and colleagues overseas."

Intelligence director James Clapper's infamous lie to Congress – in which he claimed just months before Snowden's leaks that the NSA was not collecting data on millions of Americans – will certainly follow him for the rest of his career even if it never leads to his prosecution. But while Clapper almost certainly broke the law, the Senate committee members in front of whom he spoke knew the truth regardless.

The Justice Department, on the other hand, convinced the Supreme Court to dismiss a case that could have dramatically curtailed the NSA's most egregious abuses of power based on false statements. And now all of us are forced to live with the consequences of that.


NSA Soap Opera: Someone call the Metaphor Police

"After 9/11, the excuse for missing clues was too much data: 'Trying to sip from a fire hose.' But with the priority now to excuse NSA spying, the metaphor is for more data: 'You can’t find a needle in a haystack without a haystack' – a shift that former FBI agent Coleen Rowley dissects."
-- Consortium News ("Independent journalism since 1995"), October 7, 2013

Coleen Rowley's analysis at Consortium News of Sen. Dianne Feinstein's argument for greater NSA data collection on U.S. citizens (which, if I recall, Feinstein backed away from a little under the continuing onslaught of released classified NSA files) is a highly informed and sometimes wryly funny fisking of attempts by NSA apologists to rewrite the history of intelligence failures related to the 9/11 attack. It's also a crash course on the problems with the metadata approach to intelligence gathering and how these are made worse by the practice of stovepiping, i.e., departments withholding key information. She reports as an insider and eyewitness to bureaucratic failures that caused critical data about the 9/11 plotters to elude the U.S. military's attention.

Fast forward to the May 13, 2014 publication of Glenn Greenwald's book, No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State and the Wall Street Journal review of the book on May 14. The review was penned by Michael B. Mukasey, a former Attorney General of the United States and former U.S. District Judge for the Southern District of New York.
Mukasey goes to great lengths to portray Greenwald as paranoid, but he overshoots the mark when he attempts to persuade the reader that the evidence of his own eyes can't be trusted.  This has to do with a PowerPoint chart in a leaked NSA file that was featured in Greenwald's book. Mukasey's review does not actually show the graphic, although Conor Friedersdorf's review of the book (A Conservative Critique of a Radical NSA) at The Atlantic does.

You can see for yourself at The Atlantic's website that the graphic very clearly, very unambiguously, shows that the NSA and its counterparts in the "Five Eyes" alliance were determined, at least as late as 2011, to collect every bit of data they possibly could that had an electronic signature.

Not so, explains Mukasey. This chart "is apparently from what is known as a 'Signals Development' conference discussing what to do when you secure a major new source of electronic intelligence so that the wheat of useful information can be separated from the chaff of irrelevant communications."

Cold sober I don't know what wheat and chaff have to do with the chart, but I can report that after downing four glasses of absinthe I did see a certain logic in Mukasey's use of the metaphor.  It's just that I can't remember what it was.


House passes a joke called a bill to reform NSA surveillance

Mother Jones reported today The NSA Bill That Just Passed Is So Weak, Original Backers Voted Against It. The report is accompanied by a great photograph of an American bald eagle. The eagle looks as if it's hanging its head in shame.

Well, yeah.

Earlier: cyber experts noticed loopholes a mile wide in Obama's proposed NSA reforms. The scariest part? "The White House has also declined to spin off the NSA’s defense mission from its more dominant intelligence-gathering mission."  That continues the Alexander-era practice of putting a spy agency DIRECTLY in charge of cyber warfare.

Obama's NSA spying reforms fail to satisfy cyber experts
May 17, 2014

“That is the loophole that swallows the entire policy, because there’s always going to be an important national security or law enforcement purpose."

WASHINGTON — Obama administration actions to change some of the National Security Agency’s surveillance practices after the leaks of classified documents by contractor Edward Snowden are falling short of what many private cyberexperts want.

Top government experts told the Reuters Cybersecurity Summit this week they would be more transparent about spying activity. Nongovernment guests, however, said the administration wasn’t doing enough to advance Internet security.

For instance, last December a White House review commission called for a drastic reduction in the NSA’s practice of keeping secret the software vulnerabilities it learns about and then exploiting them for spying purposes.

White House cybersecurity advisor Michael Daniel said at the conference that he would lead the interagency group charged with weighing each newly discovered software flaw and deciding whether to keep it secret or warn the software maker about it.

“The policy has been in place for a number of years, but it was not as active as we decided that it should be,” Daniel said. “(Now) there is a process, there is rigor in that process, and the bias is very heavily tilted toward disclosure.”

Commission member Peter Swire told the summit he was pleased by the formal process for debating vulnerability use, but others said there were too many loopholes.

In an April 28 White House blog post, Daniel wrote that the factors the interagency group would consider included the likelihood that the vulnerability would be discovered by others and how pressing was the need for intelligence.

“That is the loophole that swallows the entire policy, because there’s always going to be an important national security or law enforcement purpose,” Chris Soghoian, a technology policy analyst with the American Civil Liberties Union, said at the summit.

Some security experts active in the market for trading software flaws said they had seen no dip in U.S. purchases.

“There’s been no change in the market at all as far as we can see,” said Adriel Desautels, chief executive of Netragard Inc, which buys and sells programs taking advantage of undisclosed flaws.

The White House has also declined to spin off the NSA’s defense mission from its more dominant intelligence-gathering mission, as the commission recommended. New NSA Director Michael Rogers told the summit that the agency could keep doing both offense and defense and that “a good, strong Internet is in the best interest of the nation.”

The review commission implicitly acknowledged that the NSA had developed the capability to penetrate some widely used cryptography. And it urged the NSA to commit to not undermine encryption standards.

The White House has issued no policy statement in response.

Daniel said, “(Officials) do not have any intention of engineering vulnerabilities into algorithms that undergird electronic commerce.”

Critics say such statements leave ample wiggle room. Among other things, they don’t preclude using backroom deals.

For instance, the Snowden documents published by journalists say Microsoft Corp. had worked with the NSA to let the agency obtain access to some user emails before they were encrypted.

“The way most crypto gets broken is through implementation,” Swire said. “How you set up crypto is very important.”

According to Snowden documents, the NSA has hacked into Google and impersonated Facebook overseas, where it faces far fewer restrictions on what it can collect. The NSA has said nothing about changing such tactics.

Tuesday, May 20

Megadrought: Ari LeVaux had me at "Hello"

Ari LeVaux is a food columnist based in Placitas, New Mexico, and a contributor to Writers on the Range, a service of High Country News. One day in 2012 he sat down and wrote an article about water scarcity in his own neighborhood, what it was doing to the people there, how they were reacting.  The writing did what all the world's climate scientists, environmentalists and global warming activists had never managed: it caused me to consider that it just might be late in my country's day.

The reports on drought I've posted on this blog in recent days were gathered because I stumbled across Ari's story in 2013, and my interest in silvopasturing is also a direct result of my reading it. So I think it's fitting that to round out my series on drought I post his writing:

Megadrought, the new normal
by Ari LeVaux
July 27, 2012
Writers on the Range, High Country News

In a dirt parking lot near Many Farms, Arizona, a Navajo farmer sold me a mutton burrito. He hasn't used his tractor in two years, he told me; he has to cook instead of farm because "there isn't any water." He pointed east at the Chuska Mountains, which straddle the New Mexico border. In a normal year, water coming off those mountains reaches his fields, he said. No more.

His experience might just be the new normal for the American Southwest, writes William deBuys in his book, A Great Aridness. It was published late last year, months after one of the Southwest's driest summers in recorded history, during which fires of unprecedented size scorched hundreds of thousands of acres of forest.

This summer is even worse; forest fires have already broken last year's records. Springs, wells and irrigation ditches are bone-dry. Farms are withering. We've all heard the gloomy scenarios of global warming: extreme weather, drought, famine, the breakdown of society. My current perch in Placitas, N.M., feels like a front-row seat at the apocalypse.

Yet deBuys says we don't really know if the current drought in the Southwest is a consequence of global warming. Periodic, decades-long droughts have been relatively common in the last few thousand years, according to analysis of dried lakebeds. Most of the area's famously collapsed civilizations -- Chaco Canyon, Mesa Verde, the Galisteo pueblos -- are thought to have died out for lack of water in these extended dry periods, which deBuys calls "megadroughts."

By contrast, the last century's human population growth in the American Southwest occurred during a relatively wet period in the climatic record. We were due for another megadrought sooner or later, deBuys says, though climate change could make that dry event come sooner.

In the Sandia Mountains above Placitas, last winter's snowpack was relatively high. But the spring runoff never came because the snow evaporated straight into the air during the hottest spring on record. Lynn Montgomery has been farming in Placitas for more than 40 years. Like many farmers in northern New Mexico, he irrigates his land with water from an acequia, a type of canal system implemented by Spaniards, who'd adopted the technique from the Moors. This year, for the second year in a row, Montgomery's acequia has run dry. Last year, summer rains came in time to save his crops, but this year they haven't come.

First to go were the young Italian prune trees. His more established pear trees were next. Now, his decades-old grape vines are dropping their fruit and clinging to their lives. The 30-year-old asparagus patch is toast, as are the perennial herbs, garlic and strawberries. Even the weeds are dead.

The farm was part of a thriving community in the 1960s and '70s. Then people gradually left; Montgomery was the last man standing. He sold the farm to the local Pueblo Indian tribe, on the condition that they assume ownership after his death. He spent the proceeds paying lawyers to enforce water law around Placitas, managing to stop several developments that would have tapped the fragile aquifer.

Despite his successes, many wells were drilled, especially in the 1980s and 1990s, dropping the water table to the point at which many springs in Placitas began running dry, along with the acequias they feed.

Montgomery's neighbors, with the turn of a tap, can still water their grass and wash their cars, thanks to the wells that killed the spring that feeds his acequia. But it's only a matter of time, he told me, until they feel his pain -- literally.

Harold Trujillo is a member of an acequia near Mora, N.M. All the acequias in his Sangre de Cristo mountain valley, near the headwaters of the Pecos River, are dry, he told me. Before this year, the worst he remembered was 2002, which, according to the Colorado state engineer's office, was the region's driest year in the last 300.

"In 2002, there were natural ponds that never dried up. Cows could drink out of them. Now those ponds are dry. People have been digging them deeper with backhoes to get them to fill with water," Trujillo said.

Tempers are also getting short. Trujillo said he was verbally threatened last weekend at Morphy Lake, the reservoir his acequia association helped build, by people wanting more water released now.

Meanwhile, Lynn Montgomery is retooling his farm. He's installed a holding tank, in which he'll be able to store precious acequia flow in future years, before it goes dry again. And he's switching from traditional flood irrigation, the way it's always been done in Placitas, to more efficient drip tape. Perhaps ingenuity and resilience will help him cope with the new normal.


The other drought monitor and what it portends

First, a note on my use of the term "megadrought" in the previous post: While a megadrought can be considered a drought of longer than a decade's duration, the definition established by science is "a prolonged drought lasting two decades or longer," according to Wikipedia.  See Wikipedia for more discussion of the issue, and also note that the Climate Central article in this post quotes two scientists who speak of the current drought of 13 years' duration in the USA as a megadrought.

One thing the scientists agree on:  "The term megadrought is generally used to describe the length of a drought and not its acute intensity," according to Wikipedia. The Dust Bowl drought, while not a megadrought, was 10 years of hell for the people who lived through it. 

And yet it was human actions in the Dust Bowl region that helped create some of the worst conditions for the families who toughed it out.  So there was the cyclic, drought-inducing weather pattern, and there was that other factor: people.

May 15, 2014; Wired Magazine: Betsy Mason writes about a U.S. Drought Monitor/NASA Earth Observatory map showing drought conditions in the USA as of May 6:  Map Shows Half of the US Suffering Drought Conditions. "[...] As scary as the map is, it doesn’t convey the true severity of the situation because the impacts are cumulative from several years of drought, particularly in the southwest centered on Northern Texas."

The United States Drought Monitor, which relies greatly on satellite technology, is only 14 years old, but it's been a great boon. However, there's another drought monitor, one far older. It's trees and tree rings.

It wasn't until the modern era that the reading of tree rings for information on periods of drought became a science, known as dendrochronology. That science has also made it possible to 'read' trees that were preserved underwater, sometimes for centuries, for the data they can reveal about past droughts.
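
The core trick of that science is crossdating: sliding an undated ring-width series along a dated master chronology until the patterns of wide and narrow rings line up. A minimal sketch of the idea, using made-up data and function names of my own (this is a toy illustration, not a working dendrochronology tool):

```python
import numpy as np

def crossdate(master, sample):
    """Slide an undated ring-width sample along a dated master chronology
    and return the offset (start index) whose overlap correlates best."""
    best_offset, best_r = None, -2.0
    n = len(sample)
    for start in range(len(master) - n + 1):
        window = master[start:start + n]
        r = np.corrcoef(window, sample)[0, 1]
        if r > best_r:
            best_offset, best_r = start, r
    return best_offset, best_r

# Hypothetical master chronology of standardized ring widths (200 years)
rng = np.random.default_rng(42)
master = rng.normal(1.0, 0.3, 200)

# A "sample" copied from years 120-149 of the master, with measurement noise
sample = master[120:150] + rng.normal(0.0, 0.05, 30)

offset, r = crossdate(master, sample)
print(offset, r)  # the best match should land where the sample was taken from
```

Real crossdating adds detrending, multiple overlapping samples, and statistical checks, but the matching step is essentially this correlation search.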

The thing about trees is that they're very accurate record-keepers. They don't make mistakes.  Let's see what the trees are telling dendrochronologists about megadroughts in the USA. From Caroline Fraser's June 20, 2013, analysis for Yale's e-360, Megadrought in U.S. Southwest: A Bad Omen for Forests Globally:

[...] With a highly variable climate, the Southwest boasts perhaps the best-studied megadrought history in the world. It’s the home of dendrochronology, the science of studying tree-rings, first developed at the University of Arizona.

The pronounced seasonality of hot summers followed by cold winters produces well-defined rings, while archaeological fascination with Southwestern cultures — Chaco Canyon, Mesa Verde, and other sites where ancient peoples flourished and disappeared — has supported the collection and study of centuries of tree-ring data.

Temperate-zone trees lay down wider rings in wet years, which narrow or vanish during drought. What’s more, rings can be precisely dated, with sets matched against each other, revealing burn scars and patterns of climate, precipitation, drought stress, and tree mortality.

Park Williams, a young bioclimatologist and postdoctoral fellow at Los Alamos National Laboratory, has teamed up with other specialists at the U.S. Geological Survey (USGS) and the University of Arizona to wring new insight from the data set spanning the years 1000 to 2007.

Driving recently into the Jemez Mountains near his office, we pass rust-red pines, dead or dying from drought. Later, kneeling next to a freshly cut stump, he points to a ring near the bark.

“That thick ring right there is probably 1998,” he says, a wetter El Niño year.

Armed with 13,147 such site-specific cross-sectioned specimens, gathered from more than 300 sites, Williams and his co-authors devised a new “forest drought-stress index,” integrating tree-ring measurements with climatological and historical records for a paper published earlier this year in Nature Climate Change.

Winter precipitation has long been thought important to tree growth, but another key variable leapt from this fresh examination of the data, related to a warmer, dryer climate: the average vapor pressure deficit during summer and fall, which is driven by temperature.

As air grows warmer, its capacity to hold water vapor increases exponentially, which speeds evaporation and sucks more moisture out of trees’ leaves or needles, as well as the soil itself.

If the vapor pressure deficit sucks out enough moisture, it kills trees, and there’s been a lot of that going on.

Looking back in time through the tree rings, Williams determined that the current Southwest drought, beginning in 2000, is the fifth most severe since AD 1000, set against similarly devastating megadroughts that have occurred regularly in the region.

One struck during the latter 1200s (probably driving people from the region) and another in 1572-1587, a drought that stretched across the continent to Virginia and the Carolinas. Few conifers abundant in the Southwest — including piñon, ponderosa pine, and Douglas fir — survived that latter event, despite lifespans approaching 800 years; those species have since regrown.

The forest drought stress index correlates strongly with these periods, while 20th-century temperature records show a connection between drought and tree mortality associated with huge wildfires and bark-beetle outbreaks, such as the devastating ones of the past two decades.

Williams’ study is also supported by satellite fire data from the past few decades, revealing an exponential relationship between drought stress and areas killed by wildfire.

His projections, based on climate forecasts, sparked grim headlines throughout the region: If the climate warms as expected, forests in the Southwest will be suffering regularly from drought stress by 2050 at levels exceeding previous megadroughts.

After 2050, he calculates, 80 percent of years will exceed those levels. “The majority of forests in the Southwest probably cannot survive in the temperatures that are projected,” he says.[...]
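The "exponential" claim in the excerpt about warm air and water vapor can be made concrete with the Tetens approximation, a standard empirical formula for saturation vapor pressure. A quick sketch (the function names are mine; the constants are the common FAO-56 form of the formula):

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Tetens approximation (FAO-56 constants): saturation vapor pressure in kPa."""
    return 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

def vapor_pressure_deficit(t_celsius, rel_humidity_pct):
    """VPD = saturation vapor pressure minus actual vapor pressure, in kPa."""
    es = saturation_vapor_pressure(t_celsius)
    ea = es * rel_humidity_pct / 100.0
    return es - ea

# Warming the air from 15 C to 35 C at the same 30% relative humidity:
# the deficit -- the atmosphere's "pull" on moisture in soil and needles --
# roughly doubles with each 10 C step, not linearly.
for t in (15, 25, 35):
    print(t, round(vapor_pressure_deficit(t, 30), 2))
```

That near-doubling per 10 degrees is why a modestly hotter drought stresses trees so much harder than a cooler one of the same length.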

This still doesn't explain whether the current drought in the U.S. is a true megadrought.  For insight on the answer, which is still a matter of debate among climatologists, I turn to the best plain-English examination of the question I've found on the Internet. (See the website for source links.):

Is the West’s Dry Spell Really a Megadrought?
By Bobby Magill
December 12, 2013
Climate Central

SAN FRANCISCO — The drought that has been afflicting most of the Western states for the past 13 years may be a “megadrought,” and the likelihood is high that this century could see a multi-decade dry spell like nothing else seen over the past 1,000 years, according to research presented at the American Geophysical Union Fall Meeting on Wednesday and Thursday.

Today, drought or abnormally dry conditions are affecting every state west of the Mississippi River and many on the East Coast, with much of the Southwest under long-term severe, extreme or exceptional drought conditions. While drought conditions nationwide are down this year, they remain entrenched in the West.

Since 2000, the West has seen landscape-level changes to its forests as giant wildfires have swept through the Rockies and the Sierra Nevada, bark beetles have altered the ecology of forests by killing countless trees and western cities have begun to come to terms with water shortages made worse by these changes as future snowpack and rainfall becomes less and less certain in a changing climate.

“The current drought could be classified as a megadrought — 13 years running,” paleoclimatologist Edward Cook, director of the Tree Ring Laboratory at Columbia University’s Lamont-Doherty Earth Observatory in Palisades, N.Y., said at an AGU presentation Wednesday night. “There’s no indication it’ll be getting any better in the near term.”

But the long period of drought the West is currently experiencing may not be a product of human-caused climate change, and could be natural, he said.

“It’s tempting to blame radiative forcing of climate as the cause of megadrought,” Cook said. “That would be premature. Why? There’s a lot of variability in the system that still can’t be separated cleanly from CO2 forcing on climate. Natural variability still has a tremendous impact on the climate system.”

Tree ring data show that decades-long droughts have occurred before humans started emitting greenhouse gases that fuel climate change. Long-lasting drought events have been tied to fluctuations in ocean conditions, which can alter large-scale weather patterns. For example, when the tropical Pacific Ocean is cooler than average, but the Atlantic Ocean is unusually mild — as has been the case during the past several years — there is a higher risk of drought in parts of the West and Central U.S.

The area of the West affected by severe drought in the Medieval period was much larger, and the droughts lasted much longer, than the current drought, tree ring data show.

It is “indeed pretty scary,” Cook said. “One lasted 29 years. One lasted 28 years. They span the entire continental United States.”

Two megadroughts in the Sierra Nevada of California lasted between 100 and 200 years.

Cook is among the first to suggest that the current drought in the West is a megadrought, which is typically defined as a widespread drought lasting for two decades or longer, Cornell University assistant professor of earth and atmospheric sciences Toby Ault said during an AGU presentation Thursday.

But the idea that the current 13-year dry spell will be of similar magnitude to the megadroughts found in tree ring records is a subject of debate.

“Are we in a megadrought? I guess we are,” Ault said. “They are a threat to civilization in the future.”

Ault is studying the probability that the U.S. will experience a megadrought this century on the order of no other dry period seen here at any time in the last millennium.

Data gleaned from tree rings and other sources show that the chance of a decade-long drought in the U.S. this century would be about 45 percent, and a multi-decade-long drought less than 10 percent, he said. 

“That’s not the whole picture because we’re going to see climate change in this century,” he said.

He said that the chances of a widespread multi-decade megadrought are high in the worst-case scenario, but he quoted University of Arizona geosciences professor Jonathan Overpeck to characterize the chances of megadrought in less severe scenarios: “It’s extremely non-negligible, the risk of prolonged multi-decadal megadrought.”

The bottom line: “The picture looks like we’re going to have to take this seriously,” Ault said.

Such dry spells would have severe implications for the nation’s water supply, and the U.S. is going to have to adapt and find smarter ways to cope, he said.

The current drought is occurring at a time of sweeping and abrupt changes in the nation’s forests as a result of both the extended dry period and human-caused climate change, said Lisa Graumlich, dean of the College of the Environment at the University of Washington.

Speaking at AGU on Wednesday, Graumlich said vast ecosystem changes are happening at an unprecedented scale across the country as tree mortality in Western forests is increasing dramatically, partly because bark beetles are spreading widely as summer warm seasons are longer than before.

“The time in which forests are burning in the West is much longer than it was in previous decades,” she said. “Forest insects are erupting across the West.”

Those changes and others including loss of sea ice, longer growing seasons in the Arctic, tundra being replaced by forests and shrubs, are occurring across an area scientists haven’t seen before, Graumlich added.

“We’re seeing right now ecosystem tipping points. They’re at an unprecedented spatial scale. They’re related to timing of biological events that ecologists are finding surprising.”


Monday, May 19

Stop illegal immigration to USA until we learn whether the drought is a megadrought

This isn't the next post I mentioned earlier; I'll be publishing that one later tonight.  But I want to write up this warning while it's on my mind. There are two immediate and very severe negative consequences of a real megadrought once it gets seriously underway:

1.  Diaspora. This happened even during the Dust Bowl era in the USA, although the Dust Bowl drought, at roughly 10 years, didn't last long enough to qualify as a megadrought.  By the looser definition, parts of the USA are already in a megadrought, since the current drought has run longer than 13 years -- though it still falls short of the stricter two-decade threshold. But a real megadrought, which the United States hasn't experienced in modern times, is guaranteed to create a massive displacement of people, and it will happen very quickly.

This will mean millions of Americans fleeing the state or region where the drought is the worst, all of them seeking jobs and a place to live, any kind of job and domicile they can get.  This will put them in direct competition with illegal immigrants for jobs and a place to live, not to mention social services in U.S. states -- states that are already facing big budget shortfalls and a lack of affordable housing and jobs.

To return to the Dust Bowl crisis, it coincided with the Great Depression.  So, many of the Americans who piled into California weren't actually "Okies" (residents of the Dust Bowl region in Oklahoma) even though the label was applied to all the newcomers to the state.  Many were white collar workers from around the country and even back East, desperately seeking any kind of job they could get, and any place to live.  This was happening at a time when illegal immigration wasn't a problem, but still it caused great dislocations and a lot of social unrest in California.  (This aspect of U.S. history was addressed in the second part of Ken Burns' Dust Bowl documentary.) 

2.  Severe water shortages. I've seen reports that mention there are already water shortages in various parts of the USA because of the drought. However, at present I don't have a breakdown of exactly where this happening and the severity in each instance.
If any reader can find such data, please email it to me at  (The comment section at this blog only works sporadically.)

What is clear at this point is that much of the illegal immigration is coming into or at least through the southwestern USA -- just the place where the drought is the worst, and the place where a megadrought, if it is that, is now getting underway. 

If it is a real megadrought, you don't want to think about the kind of water shortages this is going to create, long term, and what this is going to do to American society. And our social services.

The reality is that this is the wrong time for the United States to be opening its arms to immigrants -- legal or otherwise -- but while legal immigration can't be stopped, at least not readily, the Mexican government can be pressured to cease forwarding its own illegal immigrants from further south to the USA.

Of course this will cause bad feelings among Mexican officials and upset Americans angling to set up factories in Mexico. And it will infuriate the Democratic Party leaders, who encourage illegal immigration to shore up party support, and Republican Party leaders, who only make a show of being against illegal immigration.

I don't know what to tell these people beyond: say "No" now, while doing so doesn't have to be backed by the most ruthless measures. You can always turn around if the signs about megadrought are a false alarm. I will examine that question of signs in the next post.