NEWS2U Media
The Truth Mainstream Media Avoids

Thursday, November 28, 2013

Healthcare.gov and the Gulf Between Planning and Reality

By Clay Shirky
Weblog
Nov. 19, 2013
[Emphasis added]

Back in the mid-1990s, I did a lot of web work for traditional media. That often meant figuring out what the client was already doing on the web, and how it was going, so I’d find the techies in the company, and ask them what they were doing, and how it was going. Then I’d tell management what I’d learned. This always struck me as a waste of my time and their money; I was like an overpaid bike messenger, moving information from one part of the firm to another. I didn’t understand the job I was doing until one meeting at a magazine company.

The thing that made this meeting unusual was that one of their programmers had been invited to attend, so management could outline their web strategy to him. After the executives thanked me for explaining what I’d learned from log files given me by their own employees just days before, the programmer leaned forward and said “You know, we have all that information downstairs, but nobody’s ever asked us for it.”

I remember thinking “Oh, finally!” I figured the executives would be relieved this information was in-house, delighted that their own people were on it, maybe even mad at me for charging an exorbitant markup on local knowledge. Then I saw the look on their faces as they considered the programmer’s offer. The look wasn’t delight, or even relief, but contempt. The situation suddenly came clear: I was getting paid to save management from the distasteful act of listening to their own employees.

In the early days of print, you had to understand the tech to run the organization. (Ben Franklin, the man who made America a media hothouse, called himself Printer.) But in the 19th century, the printing press became domesticated. Printers were no longer senior figures — they became blue-collar workers. And the executive suite no longer interacted with them much, except during contract negotiations.

This might have been nothing more than a previously hard job becoming easier, Hallelujah. But most print companies took it further. Talking to the people who understood the technology became demeaning, something to be avoided. Information was to move from management to workers, not vice versa (a pattern that later came to other kinds of media businesses as well). By the time the web came around and understanding the technology mattered again, many media executives hadn’t just lost the habit of talking with their own technically adept employees, they’d actively suppressed it.

I’d long forgotten about that meeting and those looks of contempt (I stopped building websites before most people started) until the launch of Healthcare.gov.

* * *

For the first couple of weeks after the launch, I assumed any difficulties in the Federal insurance market were caused by unexpected early interest, and that once the initial crush ebbed, all would be well. The sinking feeling that all would not be well started with this disillusioning paragraph about what had happened when a staff member at the Centers for Medicare & Medicaid Services, the department responsible for Healthcare.gov, warned about difficulties with the site back in March.

In response, his superiors told him…

[...] in effect, that failure was not an option, according to people who have spoken with him. Nor was rolling out the system in stages or on a smaller scale, as companies like Google typically do so that problems can more easily and quietly be fixed. Former government officials say the White House, which was calling the shots, feared that any backtracking would further embolden Republican critics who were trying to repeal the health care law.

The idea that “failure is not an option” is a fantasy version of how non-engineers should motivate engineers. That sentiment was invented by a screenwriter, riffing on an after-the-fact observation about Apollo 13; no one actually said it at the time. (If you ever say it, wash your mouth out with soap. If anyone ever says it to you, run.) Even NASA’s vaunted moonshot, so often referred to as the best of government innovation, tested with dozens of unmanned missions first, several of which failed outright.

Failure is always an option. Engineers work as hard as they do because they understand the risk of failure. And for anything it might have meant in its screenplay version, here that sentiment means the opposite; the unnamed executives were saying “Addressing the possibility of failure is not an option.”

* * *

The management question, when trying anything new, is “When does reality trump planning?” For the officials overseeing Healthcare.gov, the preferred answer was “Never.” Every time there was a chance to create some sort of public experimentation, or even just some clarity about its methods and goals, the imperative was to avoid giving the opposition anything to criticize.

At the time, this probably seemed like a way of avoiding early failures. But the project’s managers weren’t avoiding those failures. They were saving them up. The actual site is worse—far worse—for not having early and aggressive testing. Even accepting the crassest possible political rationale for denying opponents a target, avoiding all public review before launch has given those opponents more to complain about than any amount of ongoing trial and error would have.

In his most recent press conference about the problems with the site, the President ruefully compared his campaigns’ use of technology with Healthcare.gov:

And I think it’s fair to say that we have a pretty good track record of working with folks on technology and IT from our campaign, where, both in 2008 and 2012, we did a pretty darn good job on that. [...] If you’re doing it at the federal government level, you know, you’re going through, you know, 40 pages of specs and this and that and the other and there’s all kinds of law involved. And it makes it more difficult — it’s part of the reason why chronically federal IT programs are over budget, behind schedule.

It’s certainly true that Federal IT is chronically challenged by its own processes. But the biggest problem with Healthcare.gov was not timeline or budget. The biggest problem was that the site did not work, and the administration decided to launch it anyway.

This is not just a hiring problem, or a procurement problem. This is a management problem, and a cultural problem. The preferred method for implementing large technology projects in Washington is to write the plans up front, break them into increasingly detailed specifications, then build what the specifications call for. It’s often called the waterfall method, because on a timeline the project cascades from planning, at the top left of the chart, down to implementation, on the bottom right.

Like all organizational models, waterfall is mainly a theory of collaboration. By putting the most serious planning at the beginning, with subsequent work derived from the plan, the waterfall method amounts to a pledge by all parties not to learn anything while doing the actual work. Instead, waterfall insists that the participants will understand best how things should work before accumulating any real-world experience, and that planners will always know more than workers.

This is a perfect fit for a culture that communicates in the deontic language of legislation. It is also a dreadful way to make new technology. If there is no room for learning by doing, early mistakes will resist correction. If the people with real technical knowledge can’t deliver bad news up the chain, potential failures get embedded rather than uprooted as the work goes on.

At the same press conference, the President also noted the degree to which he had been kept in the dark:

OK. On the website, I was not informed directly that the website would not be working the way it was supposed to. Had I been informed, I wouldn’t be going out saying “Boy, this is going to be great.” You know, I’m accused of a lot of things, but I don’t think I’m stupid enough to go around saying, this is going to be like shopping on Amazon or Travelocity, a week before the website opens, if I thought that it wasn’t going to work.

Healthcare.gov is a half-billion dollar site that was unable to complete even a thousand enrollments a day at launch, and for weeks afterwards. As we now know, programmers, stakeholders, and testers all expressed reservations about Healthcare.gov’s ability to do what it was supposed to do. Yet no one who understood the problems was able to tell the President. Worse, every senior political figure—every one—who could have bridged the gap between knowledgeable employees and the President decided not to.

And so it was that, even on launch day, the President was allowed to make things worse for himself and his signature program by bragging about the already-failing site and inviting people to log in and use something that mostly wouldn’t work. Whatever happens to government procurement or hiring (and we should all hope those things get better), a culture that prefers deluding the boss over delivering bad news isn’t well equipped to try new things.

* * *

With a site this complex, things were never going to work perfectly the first day, whatever management thought they were procuring. Yet none of the engineers with a grasp of this particular reality could successfully convince the political appointees to adopt the obvious response: “Since the site won’t work for everyone anyway, let’s decide what tests to run on the initial uses we can support, and use what we learn to improve.”

In this context, testing does not just mean “Checking to see what works and what doesn’t.” Even the Healthcare.gov team did some testing; it was late and desultory, but at least it was there. (The testers recommended delaying launch until the problems were fixed. This did not happen.) Testing means seeing what works and what doesn’t, and acting on that knowledge, even if that means contradicting management’s deeply held assumptions or goals. In well-run organizations, information runs from the top down and from the bottom up.

One of the great descriptions of what real testing looks like comes from Valve software, in a piece detailing the making of its game Half-Life. After designing a game that was only sort of good, the team at Valve revamped its process, including constant testing:

This [testing] was also a sure way to settle any design arguments. It became obvious that any personal opinion you had given really didn’t mean anything, at least not until the next test. Just because you were sure something was going to be fun didn’t make it so; the testers could still show up and demonstrate just how wrong you really were.

“Any personal opinion you had given really didn’t mean anything.” So it is in the government; any insistence that something must work is worthless if it actually doesn’t.

An effective test is an exercise in humility; it’s only useful in a culture where desirability is not confused with likelihood. For a test to change things, everyone has to understand that their opinion, and their boss’s opinion, matters less than what actually works and what doesn’t. (An organization that isn’t learning from its users has decided it doesn’t want to learn from its users.)

Given comparisons with technological success from private organizations, a common response is that the government has special constraints, and thus cannot develop projects piecemeal, test with citizens, or learn from its mistakes in public. I was up at the Kennedy School a month after the launch, talking about technical leadership and Healthcare.gov, when one of the audience members made just this point, proposing that the difficult launch was unavoidable, because the government simply couldn’t have tested bits of the project over time.

That observation illustrates the gulf between planning and reality in political circles. It is hard for policy people to imagine that Healthcare.gov could have had a phased rollout, even while it is having one.

At launch, on October 1, only a tiny fraction of potential users could actually try the service. They generated concrete errors. Those errors were handed to a team whose job was to improve the site, already public but only partially working. The resulting improvements are incremental, and put in place over a period of months. That is a phased rollout, just one conducted in the worst possible way.
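
A deliberate version of the phased rollout described above is a standard engineering pattern: admit a small, stable fraction of users, watch the concrete errors they generate, and widen the gate only as fixes land. The sketch below is illustrative, not anything Healthcare.gov actually ran; the function name and the 1% figure are assumptions for the example.

```python
import hashlib

def in_rollout(user_id: str, fraction: float) -> bool:
    """Deterministically admit a stable fraction of users.

    Hashing the user ID means the same user always gets the same
    answer, so the admitted group only grows as `fraction` is raised.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    bucket = int.from_bytes(digest[:8], "big") / 2**64  # uniform in [0, 1)
    return bucket < fraction

# Launch to 1% of users, fix what breaks, then widen the gate.
users = [f"user{i}" for i in range(10_000)]
early = [u for u in users if in_rollout(u, 0.01)]
print(f"{len(early)} of {len(users)} users admitted at 1%")
```

The key design choice is determinism: raising the fraction adds new users without churning the existing cohort, so errors can be attributed to code changes rather than to a shifting test population.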

The vision of “technology” as something you can buy according to a plan, then have delivered as if it were coming off a truck, flatters and relieves managers who have no idea and no interest in how this stuff works, but it’s also a breeding ground for disaster. The mismatch between technical competence and executive authority is at least as bad in government now as it was in media companies in the 1990s, but with much more at stake.

* * *

Tom Steinberg, in his remembrance of his brilliant colleague Chris Lightfoot, said this about Lightfoot’s view of government and technology:

What he fundamentally had right was the understanding that you could no longer run a country properly if the elites don’t understand technology in the same way they grasp economics or ideology or propaganda. His analysis and predictions about what would happen if elites couldn’t learn were savage and depressingly accurate.

Now, and from now on, government will interact with its citizens via the internet, in increasingly important ways. This is a non-partisan issue; whichever party is in the White House will build and launch new forms of public service online. Unfortunately for us, our senior political figures have little habit of talking to their own technically adept employees.

If I had to design a litmus test for whether our political class grasps the internet, I would look for just one signal: Can anyone with authority over a new project articulate the tradeoff between features, quality, and time?

When a project cannot meet all three goals—a situation Healthcare.gov was clearly in by March—something will give. If you want certain features at a certain level of quality, you’d better be able to move the deadline. If you want overall quality by a certain deadline, you’d better be able to simplify, delay, or drop features. And if you have a fixed feature list and deadline, quality will suffer.

Intoning “Failure is not an option” will be at best useless, and at worst harmful. 

There is no “Suddenly Go Faster” button, no way you can throw in money or additional developers as a late-stage accelerant; money is not directly tradable for either quality or speed, and adding more programmers to a late project makes it later. You can slip deadlines, reduce features, or, as a last resort, just launch and see what breaks.
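
The claim that adding programmers to a late project makes it later is Brooks’s law, from The Mythical Man-Month, and the usual intuition behind it is simple arithmetic: pairwise coordination paths grow quadratically with team size. This is a toy counting argument, not a schedule model of any real project:

```python
def communication_channels(team_size: int) -> int:
    """Pairwise communication paths in a team of n people: n * (n - 1) / 2."""
    return team_size * (team_size - 1) // 2

# Doubling a 10-person team more than quadruples coordination overhead,
# while the newcomers still have to be trained by the people doing the work.
for n in (5, 10, 20, 40):
    print(n, communication_channels(n))
```

Going from 10 to 20 people takes the team from 45 to 190 possible channels, which is part of why money spent on late-stage headcount buys coordination cost rather than speed.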

Denying this tradeoff doesn’t prevent it from happening. If no one with authority over the project understands that, the tradeoff is likely to mean sacrificing quality by default. That just happened to this administration’s signature policy goal. It will happen again, as long as politicians are allowed to imagine that if you just plan hard enough, you can ignore reality. It will happen again, as long as department heads imagine that complex technology can be procured like pencils. It will happen again as long as management regards listening to the people who understand the technology as a distasteful act.

Source:
http://www.shirky.com/weblog/2013/11/healthcare-gov-and-the-gulf-between-planning-and-reality
 ______________________

Saturday, November 23, 2013

US and UK drop in Web rankings for freedom and openness in 2013 Web Index

Insidious government surveillance may be worse than outright censorship

By Tim Berners-Lee
Ars Technica
Nov. 22, 2013

The insidious nature of government spying has a chilling and subtle effect on Web freedoms that could ultimately be more damaging to society than outright censorship, World Wide Web creator Tim Berners-Lee told the audience at the launch of the World Wide Web Foundation's 2013 Web Index findings.
The legacy of the revelations by whistleblower Edward Snowden about the actions of national security agencies (the NSA, GCHQ, and others) will be long-lasting, intimated Berners-Lee. While nations such as China openly engage in censorship, what the US and the UK have done could potentially leave a trail of paranoia that in turn leads to a trend for self-censorship among citizens of the allegedly "free" West.
"The question of 'who is it that's got the off switch for our connectivity' started to be asked because of Egypt," said Berners-Lee. "It's a rather obvious thing you can see happening, and a country that does that doesn't get very far. Turning off the Internet got the youths onto the streets because that's what they had left to do. So blocking of the Internet is kind of obvious. And censorship in places like China is obvious too when it comes to blocking whole websites. It's hard to pretend it doesn't exist when the rest of the Web has links to those websites."
"But spying is this insidious form, because of its chilling effect if you feel someone's looking over your shoulder, there's all kinds of things you will not do… [You're not going to be] able to use facilities because of nameless fear."
This year's Web Index was finished in September, so Berners-Lee suggests some countries may have ranked even lower considering the revelations of the past few months. The US and the UK predictably fell a few spots down the table on the sub-index Freedom and Openness. Though the UK came out third overall, one spot ahead of the US, it came 24th for Freedom and Openness. It came eighth for Universal Access, first for Relevant Content, and third for Empowerment.
It shows the contradiction that can exist between the public's perceived freedoms online, and the government's control of those freedoms. For instance, while the US is first for Empowerment—defined as "how far the Web is empowering people not just to receive information, but to voice their own views, participate in public affairs, and take action to improve their lives"—it came 27th for Freedom and Openness. Citizen empowerment and public engagement are concealing the darker underbelly of more widespread abuses of personal privacy. "Provisions against cybercrime, terrorism, or blasphemy are frequently being employed to silence legitimate dissent or justify blanket digital surveillance," explains the report.
Anonymity is a key area where the state's agenda directly clashes with concepts of freedom, suggested Berners-Lee, and it will be a complex issue to solve.
"Some things are good, like the openness of government data, and some things are just bad. But anonymity is one where it's not so simple. NGOs that work under oppressive regimes and are in contact with the underground campaign for it. Then we have people dealing with cyber bullying where clearly if someone's saying nasty, mean untrue things about you then you can reveal who they are."
In any situation where users say they need a secondary identity, Berners-Lee says we need "a whole social system and machinery" around that service to protect others. Communities need to be self-monitoring so that anonymity can legitimately exist where necessary—as with whistleblower Edward Snowden.

Mass surveillance for everyone!

The revelations of the US and UK's mass surveillance of their own citizens could also have a knock-on effect among those countries employing more obvious means of censorship.
"In my work, I campaign in countries in a diplomatic way to explain why it's important for their economy and future to have freedom of expression," said Wikipedia founder Jimmy Wales, also at the Web Index launch. "Wikipedia provides a great platform for doing that, and in countries where it's blocked or restricted, it's hard to say we only block terrorists or pedophiles—it's not exactly a crazy thing. But when I'm sitting down with the state council information officer in China or Kazakhstan and saying to them, 'you're on the wrong side of history. This is not the approach that will be here with us in 20 years and it will look bad,' it's important to be able to point to the UK and US. And when we're no longer able to do that, it rings a little hollow to say 'you shouldn't be spying.'"
As such, Wales said it's important we as citizens voice our concerns before that mentality of entitlement makes matters worse. "We may trust GCHQ not to disappear activists, but China may feel more justified and will disappear activists. That's something we should be concerned about."
"I have some real concerns about the direction the Web is taking."
Rebecca MacKinnon, author of Consent of the Networked and co-founder of Global Voices Online, reiterated that now is the time for citizens to empower themselves and ensure these matters are not left to governments alone, a point backed up by the fact that 94 percent of all countries surveyed in the Index fail to meet "best practice standards for checks and balances on government interception of electronic communications."
"Trust no one unless you can hold them accountable," she said. "In China, where they block international sites heavily and control data tightly, they're saying from a national security point of view, we did exactly the right thing."
"There's potential for people to assert more localized power over how the Web is developing in their own countries. If that drive is coming from the government claiming to be acting on behalf of government, it's an excuse for more control and surveillance. The role of citizens in insisting on accountability is key."
Bright Simons, president of the mPedigree Network, said he doesn't believe trust has been irrevocably broken by the NSA-GCHQ revelations, mainly because people weren't all that trusting beforehand. "It's leading to a backlash of fascinating proportions. People are going to make a choice. There is awareness among citizens that it's an issue to be concerned about, and a greater level of citizen empowerment is needed. It's an important milestone in citizen digital rights."
Berners-Lee suggested that empowerment should be facilitated by the destruction of the gulf that exists between the technology world and the policy world—our technology pioneers should not be afraid to engage in political matters, and policymakers need to understand the tech when drafting relevant legislation.
"I spend a lot of time encouraging people to program, not just because we need some people to understand technology—we need lawyers and those in parliament to understand it, otherwise they're not going to be able to make appropriate steps." We need those with the technological knowledge to be writing policy as well as standards in protocols.
Despite the air of pessimism surrounding the Web Index 2013 launch in light of the state spying controversies, Berners-Lee remained positive about the many good things that are happening around the globe. According to the report, the Internet remains vital in catalyzing citizen action and real world change. Despite the fact that 30 percent of nations engage in targeted Web censorship and "moderate to extensive blocking or filtering of politically sensitive content," the Web and social media played a big role in "public mobilization" in 80 percent of nations.
"This is not being spearheaded by political parties and NGOs," said Anne Jellema, CEO of the World Wide Web Foundation. "It's spontaneous and grassroots action driven by social media." The Philippines, for instance, came 20 places higher in the table than its GDP per capita would predict. This, she said, is because those connected are "active and creative users of the Web," as demonstrated last year when they fought back against a proposed cybercrime bill and ultimately crowdsourced its replacement, a bill of rights for the Internet. It demonstrates that wealth does not necessarily correlate with a free and open Web practice. In fact some of the world's wealthiest nations fell far short of the Web Index's standards. Saudi Arabia, for instance, lingered near the bottom of the table for all sub-indexes.
"I am optimistic," said Berners-Lee. "I think the people will win. I have faith in people and humanity as a whole. There's going to be some push back, but change will come in lots of different ways—from activism, but also UN resolutions. Also from within government. There are people that care about this stuff."
This story originally appeared on Wired UK.
Source:
http://arstechnica.com/tech-policy/2013/11/berners-lee-insidious-government-surveillance-may-be-worse-than-outright-censorship/
 ___________________

Friday, November 22, 2013

JFK Assassination Anniversary

The CIA And Lee Harvey Oswald - Questions Remain 

By Joseph Lazzaro 
International Business Times 
November 15, 2013

When analyzing the assassination of President John F. Kennedy -- a murder that changed the trajectory of U.S. public policy, both foreign and domestic -- it is reasonable to ask questions because a mystery remains at the center of this case.

In other words, the American people most likely did not get the truth -- or at least the full truth -- from the woefully deficient and inadequate Warren Commission, the House Select Committee on Assassinations (HSCA) or the Assassinations Records Review Board (ARRB).
The reason? At the top of the list has been the obstruction and deception exhibited by the Central Intelligence Agency -- which hindered investigators on each panel, failed to tell the full truth, did not disclose key allegiances, and/or engaged in other acts that prevented each board from undertaking a comprehensive investigation.
For proof of that obstruction and deception, one need look no further than G. Robert Blakey, staff director and chief counsel for the House Select Committee on Assassinations, 1977-1979.
In 2003, Blakey said it best regarding the Central Intelligence Agency’s conduct and policies during the HSCA’s investigation into the Nov. 22, 1963, assassination of President John F. Kennedy: 
... I no longer believe that we were able to conduct an appropriate investigation of the [Central Intelligence] Agency and its relationship to Oswald ... We now know that the Agency withheld from the Warren Commission the CIA-Mafia plots to kill Castro. Had the commission known of the plots, it would have followed a different path in its investigation. The Agency unilaterally deprived the commission of a chance to obtain the full truth, which will now never be known. Significantly, the Warren Commission's conclusion that the agencies of the government co-operated with it is, in retrospect, not the truth. We also now know that the Agency set up a process that could only have been designed to frustrate the ability of the committee in 1976-79 to obtain any information that might adversely affect the Agency. Many have told me that the culture of the Agency is one of prevarication and dissimulation and that you cannot trust it or its people. Period. End of story. I am now in that camp.

But Don’t Jump To A Conclusion
At the same time, one should not jump to the conclusion that President Kennedy was murdered in a plot/conspiracy coordinated by the CIA, perhaps with the help of organized crime and anti-Castro Cuban rebels, as many have suggested. First, at the present time, there simply isn’t enough hard evidence to incontrovertibly prove a CIA-led plot.
Second, and equally significant, there are several scenarios -- gross incompetence, obstruction of justice or criminal negligence -- that may ultimately turn out to be closer to what really happened in Dealey Plaza in Dallas on Nov. 22, 1963 -- once still-classified JFK assassination files are made public by the CIA.
Specifically, those scenarios are: 
a) Classified, non-public information that, if released, would reflect adversely on the CIA.
b) Information that indicates CIA agents, contractors, assets or informers made enormous and unconscionable mistakes in monitoring accused assassin Lee Harvey Oswald’s file following his return from Russia (then the Soviet Union), including the failure to properly monitor his presence in Dallas and alert all law enforcement authorities of the threat he posed to public officials.
c) The failure to discipline and remove from intelligence work Agency extremists in the CIA who went rogue and then planned and implemented their own assassination operation, with or without Oswald.
In other words, as your fifth-grade elementary school teacher taught you, don’t jump to a conclusion -- of a plot/conspiracy -- without enough evidence.
Reasonable To Doubt Lone-Gunman Theory
At the same time, it is not unreasonable to doubt the Warren Commission’s conclusion that Oswald, acting alone, fired three shots from the sixth floor of the Texas School Book Depository building in Dealey Plaza in Dallas and assassinated President Kennedy, while also wounding Texas Gov. John Connally and one bystander. Many assassination researchers reject the Warren Commission’s conclusion, arguing that through omission and/or commission, the committee’s investigation was deeply flawed.
The problem is, as noted, there’s still not enough hard evidence to determine what really happened in Dallas on Nov. 22, 1963, and it is that gap -- including the failure to make public all classified U.S. government documents related to the case -- that makes it difficult either to accept the Warren Commission’s conclusion and report (its incompleteness is one reason it is implausible) or to construct a better one.
And, again, the incompleteness of the Warren Commission’s investigation -- and the HSCA’s and ARRB’s for that matter -- speaks to the need to make public all still-classified JFK assassination files held by the CIA.
What’s more, not only do the American people not have the truth -- or at least the full truth -- regarding the attack in Dealey Plaza, the American people don’t even have the full truth on Lee Harvey Oswald, or on Oswald’s interactions with the CIA, or on how the CIA treated and handled Oswald’s file. 
Here’s a classic example of that information/data gap: In the course of author and JFKFacts.org moderator Jefferson Morley’s lawsuit -- Morley v. CIA -- which seeks the release of the classified records of CIA Undercover Officer George Joannides, who was chief of psychological warfare operations at the CIA’s Miami station, the CIA acknowledged in a sworn affidavit that the agency retains 1,100 records related to JFK’s assassination that have never been made public.
Specifically, the classified records of key CIA officers and personnel -- William Harvey, David Phillips, Birch D. O’Neal, E. Howard Hunt, Anne Goodpasture, David Sanchez Morales and the aforementioned George Joannides -- will, when made public, help the nation determine what really happened in Dallas, who Lee Harvey Oswald was, and how the CIA treated and handled his file.
According to the CIA, these files are “not believed relevant” to JFK’s death.
In an affidavit filed in federal court, the CIA asserted that the 1,100 documents must remain secret until at least October 2017, due to “national security.”
The 1964 Conclusion May Be Correct
Further, of course, if you’re someone who supports the Warren Commission’s conclusion, then as far as you're concerned, there's really no need to make public the CIA’s JFK assassination files: From that standpoint, the issue of who committed the most devastating murder in modern American history has been resolved. Moreover, it is entirely possible that Oswald murdered President Kennedy while acting alone -- and that he alone is responsible.
However, the reasoning forwarded here argues that the sheer weight of the anomalies -- including the Warren Commission’s grossly slipshod collection of evidence, its failure to collect and analyze all of the evidence, its numerous other violations of protocols for criminal investigations involving ballistic, forensic and autopsy evidence, and its failure to obtain witness testimony -- combined with the analyses of other researchers, makes that scenario unlikely.
In Dealey Plaza, It Is Always Nov. 22, 1963
Since the assassination of President Kennedy, there has never been a poll in which a majority of Americans believed Lee Harvey Oswald acted alone and fired three rifle shots from the Texas School Book Depository building. Are the American people thinking incorrectly, even irrationally? Do the American people have it wrong? Travel to Dealey Plaza in Dallas, and instead of visiting the TSBD’s Sixth Floor, stand on the sidewalk on the north side of Elm Street where the presidential motorcade approached the grassy knoll.
Stand there for 10, 20, 30 minutes. Soak in the environment and location. Look at the grassy knoll, turn east to the Dal-Tex Building, then west toward the triple overpass above the Stemmons Freeway, and then back again toward the grassy knoll. If you’re like many Americans, your inner sense, your intuition, that resonance to the depth of your being, is correct. You’re correct in concluding that the American people have not received the truth -- or at least the full truth -- regarding Nov. 22, 1963.
But what that full truth is will not be known until the CIA makes public all JFK files.
 Source:
 http://www.ibtimes.com/jfk-assassination-cia-lee-harvey-oswald-questions-remain-1472050
________________________

Friday, November 08, 2013

A Fifth of Americans Go Hungry

An August 2012 Gallup poll showed that 18.2 percent of Americans lacked sufficient money for needed food at least once over the previous year. To make matters worse, the worst drought in half a century affected 80 percent of agricultural lands in the country, increasing food prices. Despite this, in 2012, Congress considered cutting support for the Supplemental Nutrition Assistance Program (SNAP)—the official name of its food stamp program—as part of the 2013 Farm Bill.
Proposed Senate cuts would cost approximately 500,000 households about ninety dollars a month in nutritional assistance. Proposed cuts in the House of Representatives would go much further than the ones in the Senate, and would have removed at least 1.8 million people from SNAP. Republicans controlling the House have been eager to cut spending and were the primary supporters of food stamp cuts.
Opponents have expressed concern over the harm the cuts would cause to society’s more vulnerable members, including seniors, children, and working families. Rising food prices would hit Southern states the hardest, while Mountain-Plains and Midwest states would be least affected. Despite all the food hardship, the Natural Resources Defense Council reported that 40 percent of food in the country goes to waste.
Sources:
Mike Ludwig, “Millions Go Hungry as Congress Considers Food Stamp Cuts and Drought Threatens Crops,” Truthout, August 23, 2012, http://truth-out.org/news/item/11067-millions-go-hungry-as-congress-considers-food-stamp-cuts-and-drought-threatens-crops.
Student Researcher: Noah Tenney (Sonoma State University) Faculty Evaluator: Andy Lee Roth (Sonoma State University)