Friday, October 17, 2014


"Slate" rediscovers IQ -- though they dare not to call it that

They recoil in horror, however, at applying the findings to intergroup differences, and claim without explanation that what is true of individuals cannot be true of groups of individuals.  That is at least counterintuitive.  They even claim that there is no evidence that IQ differences between groups are predictive of anything.

I suppose that one has to pity their political correctness, however, because the thing they are greatly at pains to avoid -- the black-white IQ gap -- is superb validation of the fact that group differences in IQ DO matter.  From their abysmal average IQ score, we would predict that blacks would be at the bottom of every heap (income, education, crime etc.) -- and that is exactly where they are.  Clearly, group differences in IQ DO matter, and the IQ tests are an excellent and valid measure of them.



We are not all created equal where our genes and abilities are concerned.

A decade ago, Magnus Carlsen, who at the time was only 13 years old, created a sensation in the chess world when he defeated former world champion Anatoly Karpov at a chess tournament in Reykjavik, Iceland, and the next day played then-top-rated Garry Kasparov—who is widely regarded as the best chess player of all time—to a draw. Carlsen’s subsequent rise to chess stardom was meteoric: grandmaster status later in 2004; a share of first place in the Norwegian Chess Championship in 2006; youngest player ever to reach World No. 1 in 2010; and highest-rated player in history in 2012.

What explains this sort of spectacular success? What makes someone rise to the top in music, games, sports, business, or science? This question is the subject of one of psychology’s oldest debates. In the late 1800s, Francis Galton—founder of the scientific study of intelligence and a cousin of Charles Darwin—analyzed the genealogical records of hundreds of scholars, artists, musicians, and other professionals and found that greatness tends to run in families. For example, he counted more than 20 eminent musicians in the Bach family. (Johann Sebastian was just the most famous.) Galton concluded that experts are “born.” Nearly half a century later, the behaviorist John Watson countered that experts are “made” when he famously guaranteed that he could take any infant at random and “train him to become any type of specialist [he] might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents.”

The experts-are-made view has dominated the discussion in recent decades. In a pivotal 1993 article published in Psychological Review—psychology’s most prestigious journal—the Swedish psychologist K. Anders Ericsson and his colleagues proposed that performance differences across people in domains such as music and chess largely reflect differences in the amount of time people have spent engaging in “deliberate practice,” or training exercises specifically designed to improve performance. To test this idea, Ericsson and colleagues recruited violinists from an elite Berlin music academy and asked them to estimate the amount of time per week they had devoted to deliberate practice for each year of their musical careers. The major finding of the study was that the most accomplished musicians had accumulated the most hours of deliberate practice. For example, the average for elite violinists was about 10,000 hours, compared with only about 5,000 hours for the least accomplished group. In a second study, the difference for pianists was even greater—an average of more than 10,000 hours for experts compared with only about 2,000 hours for amateurs. Based on these findings, Ericsson and colleagues argued that prolonged effort, not innate talent, explained differences between experts and novices.

These findings filtered their way into pop culture. They were the inspiration for what Malcolm Gladwell termed the “10,000 Hour Rule” in his book Outliers, which in turn was the inspiration for the song “Ten Thousand Hours” by the hip-hop duo Macklemore and Ryan Lewis, the opening track on their Grammy Award-winning album The Heist. However, recent research has demonstrated that deliberate practice, while undeniably important, is only one piece of the expertise puzzle—and not necessarily the biggest piece. In the first study to convincingly make this point, the cognitive psychologists Fernand Gobet and Guillermo Campitelli found that chess players differed greatly in the amount of deliberate practice they needed to reach a given skill level in chess. For example, the number of hours of deliberate practice to first reach “master” status (a very high level of skill) ranged from 728 hours to 16,120 hours. This means that one player needed 22 times more deliberate practice than another player to become a master.

A recent meta-analysis by Case Western Reserve University psychologist Brooke Macnamara and her colleagues (including the first author of this article for Slate) came to the same conclusion. We searched through more than 9,000 potentially relevant publications and ultimately identified 88 studies that collected measures of activities interpretable as deliberate practice and reported their relationships to corresponding measures of skill. (Analyzing a set of studies can reveal an average correlation between two variables that is statistically more precise than the result of any individual study.) With very few exceptions, deliberate practice correlated positively with skill. In other words, people who reported practicing a lot tended to perform better than those who reported practicing less. But the correlations were far from perfect: Deliberate practice left more of the variation in skill unexplained than it explained. For example, deliberate practice explained 26 percent of the variation for games such as chess, 21 percent for music, and 18 percent for sports. So, deliberate practice did not explain all, nearly all, or even most of the performance variation in these fields. In concrete terms, what this evidence means is that racking up a lot of deliberate practice is no guarantee that you’ll become an expert. Other factors matter.
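
The "variance explained" figures here are just the squares of the underlying practice-skill correlations. A minimal sketch of that conversion, with the implied correlations back-calculated from the percentages quoted above rather than taken from the meta-analysis itself:

```python
import math

# Percent of skill variation explained by deliberate practice,
# as quoted above for each domain.
variance_explained = {"games": 0.26, "music": 0.21, "sports": 0.18}

for domain, r_squared in variance_explained.items():
    r = math.sqrt(r_squared)  # implied practice-skill correlation
    print(f"{domain}: implied r ~ {r:.2f}, "
          f"{r_squared:.0%} explained, {1 - r_squared:.0%} unexplained")
```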

What are these other factors? There are undoubtedly many. One may be the age at which a person starts an activity. In their study, Gobet and Campitelli found that chess players who started playing early reached higher levels of skill as adults than players who started later, even after taking into account the fact that the early starters had accumulated more deliberate practice than the later starters. There may be a critical window during childhood for acquiring certain complex skills, just as there seems to be for language.

There is now compelling evidence that genes matter for success, too. In a study led by the King’s College London psychologist Robert Plomin, more than 15,000 twins in the United Kingdom were identified through birth records and recruited to perform a battery of tests and questionnaires, including a test of drawing ability in which the children were asked to sketch a person. In a recently published analysis of the data, researchers found that there was a stronger correspondence in drawing ability for the identical twins than for the fraternal twins. In other words, if one identical twin was good at drawing, it was quite likely that his or her identical sibling was, too. Because identical twins share 100 percent of their genes, whereas fraternal twins share only 50 percent on average, this finding indicates that differences across people in basic artistic ability are in part due to genes. In a separate study based on this U.K. sample, well over half of the variation between expert and less skilled readers was found to be due to genes.
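
The step from "identical twins resemble each other more than fraternal twins do" to "in part due to genes" is usually made with Falconer's classic formula, which estimates heritability as twice the gap between the two twin correlations. A minimal sketch; the drawing-ability correlations below are hypothetical placeholders, since the article does not report the actual figures:

```python
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Falconer's formula: h2 = 2 * (r_MZ - r_DZ).

    r_mz: trait correlation among identical (monozygotic) twin pairs
    r_dz: trait correlation among fraternal (dizygotic) twin pairs
    """
    return 2 * (r_mz - r_dz)

# Hypothetical correlations, for illustration only.
h2 = falconer_heritability(r_mz=0.55, r_dz=0.35)
print(f"Estimated heritability: {h2:.0%}")  # -> 40%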

In another study, a team of researchers at the Karolinska Institute in Sweden led by psychologist Miriam Mosing had more than 10,000 twins estimate the amount of time they had devoted to music practice and complete tests of basic music abilities, such as determining whether two melodies carry the same rhythm. The surprising discovery of this study was that although the music abilities were influenced by genes—to the tune of about 38 percent, on average—there was no evidence they were influenced by practice. For a pair of identical twins, the twin who practiced music more did not do better on the tests than the twin who practiced less. This finding does not imply that there is no point in practicing if you want to become a musician. The sort of abilities captured by the tests used in this study aren’t the only things necessary for playing music at a high level; things such as being able to read music, finger a keyboard, and commit music to memory also matter, and they require practice. But it does imply that there are limits on the transformative power of practice. As Mosing and her colleagues concluded, practice does not make perfect.
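
The within-pair comparison described here is a co-twin control design: for each identical pair, ask whether the twin who practiced more scored higher. A minimal simulation of the null pattern the study reports; all numbers are invented for illustration (statistics.correlation requires Python 3.10+):

```python
import random
import statistics

random.seed(1)

# Hypothetical identical-twin pairs: scores share a family/genetic
# component, and practice has no effect -- the pattern reported above.
practice_diffs, score_diffs = [], []
for _ in range(1000):
    shared = random.gauss(100, 15)  # component common to the pair
    practice_a, practice_b = random.uniform(0, 5000), random.uniform(0, 5000)
    score_a = shared + random.gauss(0, 5)  # practice never enters the score
    score_b = shared + random.gauss(0, 5)
    practice_diffs.append(practice_a - practice_b)
    score_diffs.append(score_a - score_b)

# Does the twin who practices more score higher? Near zero here.
r = statistics.correlation(practice_diffs, score_diffs)
print(f"within-pair practice/score correlation: {r:.3f}")
```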

Along the same lines, biologist Michael Lombardo and psychologist Robert Deaner examined the biographies of male and female Olympic sprinters such as Jesse Owens, Marion Jones, and Usain Bolt, and found that, in all cases, they were exceptional compared with their competitors from the very start of their sprinting careers—before they had accumulated much more practice than their peers.

What all of this evidence indicates is that we are not created equal where our abilities are concerned. This conclusion might make you uncomfortable, and understandably so. Throughout history, so much wrong has been done in the name of false beliefs about genetic inequality between different groups of people—males vs. females, blacks vs. whites, and so on. War, slavery, and genocide are the most horrifying examples of the dangers of such beliefs, and there are countless others. In the United States, women were denied the right to vote until 1920 because too many people believed that women were constitutionally incapable of good judgment; in some countries, such as Saudi Arabia, they still are believed to be. Ever since John Locke laid the groundwork for the Enlightenment by proposing that we are born as tabula rasa—blank slates—the idea that we are created equal has been the central tenet of the “modern” worldview. Enshrined as it is in the Declaration of Independence as a “self-evident truth,” this idea has special significance for Americans. Indeed, it is the cornerstone of the American dream—the belief that anyone can become anything they want with enough determination.

It is therefore crucial to differentiate between the influence of genes on differences in abilities across individuals and the influence of genes on differences across groups. The former has been established beyond any reasonable doubt by decades of research in a number of fields, including psychology, biology, and behavioral genetics. There is now an overwhelming scientific consensus that genes contribute to individual differences in abilities. The latter has never been established, and any claim to the contrary is simply false.

Another reason the idea of genetic inequality might make you uncomfortable is that it raises the specter of an anti-meritocratic society in which benefits such as good educations and high-paying jobs go to people who happen to be born with “good” genes. As the technology of genotyping progresses, it is not far-fetched to think that we will all one day have information about our genetic makeup, and that others—physicians, law enforcement, even employers or insurance companies—may have access to this information and use it to make decisions that profoundly affect our lives. However, this concern conflates scientific evidence with how that evidence might be used—which is to say that information about genetic diversity can just as easily be used for good as for ill.

Take the example of intelligence, as measured by IQ. We know from many decades of research in behavioral genetics that about half of the variation across people in IQ is due to genes. Among many other outcomes, IQ predicts success in school, and so once we have identified specific genes that account for individual differences in IQ, this information could be used to identify, at birth, children with the greatest genetic potential for academic success and channel them into the best schools. This would probably create a society even more unequal than the one we have. But this information could just as easily be used to identify children with the least genetic potential for academic success and channel them into the best schools. This would probably create a more equal society than the one we have, and it would do so by identifying those who are likely to face learning challenges and providing them with the support they might need. Science and policy are two different things, and when we dismiss the former because we assume it will influence the latter in a particular and pernicious way, we limit the good that can be done.

Wouldn’t it be better to just act as if we are equal, evidence to the contrary notwithstanding? That way, no people will be discouraged from chasing their dreams—competing in the Olympics or performing at Carnegie Hall or winning a Nobel Prize. The answer is no, for two reasons. The first is that failure is costly, both to society and to individuals. Pretending that all people are equal in their abilities will not change the fact that a person with an average IQ is unlikely to become a theoretical physicist, or the fact that a person with a low level of music ability is unlikely to become a concert pianist. It makes more sense to pay attention to people’s abilities and their likelihood of achieving certain goals, so people can make good decisions about the goals they want to spend their time, money, and energy pursuing. Moreover, genes influence not only our abilities, but the environments we create for ourselves and the activities we prefer—a phenomenon known as gene-environment correlation. For example, yet another recent twin study (and the Karolinska Institute study) found that there was a genetic influence on practicing music. Pushing someone into a career for which he or she is genetically unsuited will likely not work.

SOURCE

*****************************

Hugh Hewitt: "I Cannot Believe This Is Happening In America"

Hugh Hewitt, on his show Tuesday afternoon, announced some breaking news from Fox News' Todd Starnes about an outrageous action in my city of Houston, initiated by our Mayor, Annise Parker.  This was the first I had heard of it, and it was confirmed by my wife when she came home from work; she was already aware of the news and was as outraged as Hugh.

Hugh read the story from the Fox News website: "The city of Houston has issued subpoenas demanding that a group of pastors turn over any sermons dealing with homosexuality, gender identity, or criticism about Annise Parker. ... Any failure to reply to the subpoenas could mean that the ministers will be held in contempt of court."  It must be noted that Annise Parker is the city's first openly lesbian mayor.



This is an unbelievably outrageous attack on freedom of speech and religion.

Turn over any criticism of the mayor?  What?  Is this the United States of America, or Castro's Cuba, or the old Soviet Union, or, as Hugh quipped, Vladimir Putin's old KGB?

I agree totally with Hugh, who gave a strong and necessary rant on this unconstitutional attack by our mayor, rightly calling her that "idiot" mayor.  Hugh said, "I cannot believe this is happening in America."  I agree, with one caveat: before the presidency of Barack Obama, I could never have envisioned this happening in America.

Thankfully there are great Americans like Hugh's good friend Tony Perkins of the Family Research Council, and the Alliance Defending Freedom lawyers who are supporting the pastors against this unconstitutional action by our mayor.  I ask, as did Hugh Hewitt, that everyone go to their web site here and donate money to help them defend our freedom in this case and against similar attacks on religion and freedom of speech.

SOURCE

After publicity about her Stalinist action, the bitch has now backed down

****************************

For more blog postings from me, see  TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, GREENIE WATCH,  POLITICAL CORRECTNESS WATCH, AUSTRALIAN POLITICS, and Paralipomena (Occasionally updated) and Coral reef compendium. (Updated as news items come in).  GUN WATCH is now mainly put together by Dean Weingarten.

List of backup or "mirror" sites here or  here -- for when blogspot is "down" or failing to  update.  Email me  here (Hotmail address). My Home Pages are here (Academic) or  here (Pictorial) or  here  (Personal)

****************************

Thursday, October 16, 2014


The FBI’s bogus report on mass shootings

With even the FBI now lying, the Obama Left have driven integrity out of most of the U.S. public service.  Lies are the stock in trade of the Left.  Reality is too inconvenient for them.  Soviet disinformation was notorious, and now we have Obama disinformation.

It’s disheartening to see the FBI used to promote a political agenda, but that’s what we got with the bureau’s release last month of a study claiming to show a sharp rise in mass shootings, a la Newtown, Conn.

The FBI counted 160 “mass” or “active” shootings in public places from 2000 to 2013. Worse, it said these attacks rose from just one in 2000 to 17 in 2013. Media outlets worldwide gave the “news” extensive coverage.

Too bad the study is remarkably shoddy — slicing the evidence to distort the results. In fact, mass public shootings have only risen ever so slightly over the last four decades.

While the FBI study discusses “mass shootings or killings,” its graphs were filled with cases that had nothing to do with mass killings. Of the 160 cases it counted, 32 involved a gun being fired without anyone being killed. Another 35 cases involved a single murder.

It’s hard to see how the FBI can count these incidents, which make up 42 percent of its 160 cases, as “mass killings.” They plainly don’t fit the FBI’s old definition, which required four or more murders, nor even its new one of at least three murders.
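
A quick arithmetic check of that 42 percent figure, using only the counts quoted above:

```python
total_cases = 160      # incidents counted in the FBI study
no_fatalities = 32     # gun fired, no one killed
single_fatality = 35   # exactly one murder

non_mass = no_fatalities + single_fatality
print(f"{non_mass} of {total_cases} cases "
      f"({non_mass / total_cases:.0%}) involved zero or one death")
# -> 67 of 160 cases (42%) involved zero or one death
```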

And these non-mass shootings, with zero or one person killed, drive much of the purported increase in the number of attacks. If you consider cases where no one or only one person was killed, 50 came in the last seven years of the period the FBI examined and only 17 during the first seven years.

For example, the FBI reports that in 2010 there were 29 of these active shooter cases, but just nine involved more than a single fatality.

The FBI study also ignored 20 out of what should have been a total of 113 cases where at least two people were killed.  For example, it missed a 2001 shooting at a Chicago bar that left two dead and 21 wounded, as well as a 2004 Columbus, Ohio, attack at a concert that left four dead.  Three-quarters of the missing cases came in the first half of the study’s time period, thus again biasing the results toward finding a larger increase over time.

Another trick was the choice of 2000 as the starting date. Everybody who has studied these attacks knows that 2000 and 2001 were unusually quiet years, with few mass shootings.  Thus, by starting with those years and padding the cases in later years with non-mass shooting attacks, the study’s authors knew perfectly well they would get the result they wanted.

The picture looks quite different if you use good data and a longer time period. Back in 2000, Bill Landes of the University of Chicago and I gathered data on mass public shootings from 1977 to 1999; I’ve now updated the database.  Our criteria were similar to what the FBI said it would follow: non-gang attacks in public places.

Shootings that were part of some other crime, such as a robbery, were also excluded. But we counted cases where at least two people had been murdered in these public shootings.

Overall, there has been a slight increase in deaths from mass public shootings over these 38 years, but even then the upward trend largely depends on the single year 2012, when there were 91 deaths.

To be fair, the FBI study isn’t as shoddy as what Michael Bloomberg’s Everytown has been pushing.  The group was greatly embarrassed after it first claimed that there had been 74 school shootings between the Newtown tragedy in December 2012 and the end of this past school year, but the true number of school attacks in which the shooter intended to commit mass murder turned out to be only a small fraction of that, just 10.

Similar to the FBI report, Bloomberg’s group padded the numbers by classifying everything as a “Newtown type attack” — including a case in which a Florida student defended himself with a gun against two attackers, a 40-year-old man’s suicide in a school parking lot at 2 a.m., and gang fights after hours.

But at least Bloomberg is spending his own money to manufacture “evidence” to push his gun-control agenda. The politicization of the FBI and use of taxpayer dollars to scare Americans into supporting an agenda is far more disturbing.

SOURCE

*****************************

One in Five U.S. Residents Speaks Foreign Language at Home, Record 61.8 million

Spanish, Chinese, and Arabic speakers grew most since 2010



The Census Bureau recently released data from the 2013 American Community Survey (ACS), including languages spoken for those five years of age and older. The new data show that the number of people who speak a language other than English at home reached an all-time high of 61.8 million, up 2.2 million since 2010. The largest increases from 2010 to 2013 were for speakers of Spanish, Chinese, and Arabic. One in five U.S. residents now speaks a foreign language at home.  Among the findings:

    In 2013, a record 61.8 million U.S. residents (native-born, legal immigrants, and illegal immigrants) spoke a language other than English at home.

    The number of foreign-language speakers increased 2.2 million between 2010 and 2013. It has grown by nearly 15 million (32 percent) since 2000 and by almost 30 million (94 percent) since 1990. (A quick consistency check on these figures appears after this list.)

    The largest increases 2010 to 2013 were for speakers of Spanish (up 1.4 million, 4 percent growth), Chinese (up 220,000, 8 percent growth), Arabic (up 188,000, 22 percent growth), and Urdu (up 50,000, 13 percent growth). Urdu is the national language of Pakistan.

    Languages with more than a million speakers in 2013 were Spanish (38.4 million), Chinese (3 million), Tagalog (1.6 million), Vietnamese (1.4 million), French (1.3 million), and Korean and Arabic (1.1 million each). Tagalog is the national language of the Philippines.

    The percentage of the U.S. population speaking a language other than English at home was 21 percent in 2013, a slight increase over 2010. In 2000 the share was 18 percent; in 1990 it was 14 percent; in 1980 it was 11 percent.

    Of school-age children (5 to 17) nationally, more than one in five speaks a foreign language at home. The share is 44 percent in California and roughly one in three students in Texas, Nevada, and New York. More surprisingly, it is now one in seven students in Georgia, North Carolina, Virginia, Nebraska and Delaware; and one out of eight students in Kansas, Utah, Minnesota, and Idaho.

    Many of those who speak a foreign language at home are not immigrants. Of the nearly 62 million foreign-language speakers, 44 percent (27.2 million) were born in the United States.

    Of those who speak a foreign language at home, 25.1 million (41 percent) told the Census Bureau that they speak English less than "very well."

    States with the largest share of foreign-language speakers in 2013 include: California, 45 percent; New Mexico, 36 percent; Texas, 35 percent; New Jersey, 30 percent; Nevada, 30 percent; New York, 30 percent; Florida, 27 percent; Arizona, 27 percent; Hawaii, 25 percent; Illinois, 23 percent; Massachusetts, 22 percent; Connecticut, 22 percent; and Rhode Island, 21 percent.

    States with the largest percentage increases in foreign-language speakers 2010 to 2013 were: North Dakota, up 13 percent; Oklahoma, up 11 percent; Nevada, up 10 percent; New Hampshire, up 8 percent; Idaho, up 8 percent; Georgia, up 7 percent; Washington, up 7 percent; Oregon, up 6 percent; Massachusetts, up 6 percent; Kentucky, up 6 percent; Maryland, up 5 percent; and North Carolina, up 5 percent.

    Taking a longer view, states with the largest percentage increase in foreign-language speakers 2000 to 2013 were: Nevada, up 85 percent; North Carolina, up 69 percent; Georgia, up 69 percent; Washington, up 60 percent; South Carolina, up 57 percent; Virginia, up 57 percent; Tennessee, up 54 percent; Arkansas, up 54 percent; Maryland, up 52 percent; Delaware, up 52 percent; Oklahoma, up 48 percent; Utah, up 47 percent; Idaho, up 47 percent; Nebraska, up 46 percent; Florida, up 46 percent; Alabama, up 43 percent; Texas, up 42 percent; Oregon, up 42 percent; and Kentucky, up 39 percent.
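
The consistency check promised in the second item above: back-calculating the earlier totals from the 2013 level, using only the rounded numbers quoted in the list (a sketch, not Census microdata):

```python
total_2013 = 61.8e6  # foreign-language speakers at home, 2013

# Back out earlier totals from the reported increases.
total_2000 = total_2013 - 15e6   # "grown by nearly 15 million since 2000"
total_1990 = total_2013 - 30e6   # "by almost 30 million since 1990"

print(f"growth since 2000: {15e6 / total_2000:.0%}")  # ~32%, as reported
print(f"growth since 1990: {30e6 / total_1990:.0%}")  # ~94%, as reported
```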

Data Source. On September 18, the Census Bureau released some of the data from the 2013 ACS. The survey reflects the U.S. population as of July 1, 2013. The ACS is by far the largest survey taken by the federal government each year and includes over two million households. The Census Bureau has posted some of the results from the ACS to American FactFinder. It has not released the public-use version of the ACS for researchers to download and analyze. However, a good deal of information can be found at FactFinder. Unless otherwise indicated, the information in this analysis comes directly from FactFinder.

There are three language questions in the ACS for 2010 and 2013. The first asks whether each person in the survey speaks a language other than English at home. The second, for those who answer "yes", asks what language the person speaks at home. The third asks how well the person speaks English. Only those who speak a language at home other than English are asked about their English skills. The 1980, 1990, and 2000 decennial censuses (long form) asked almost the exact same questions.

In this Backgrounder we report some statistics for the immigrant population, referred to as the foreign-born by the Census Bureau. The foreign-born are those individuals who were not U.S. citizens at birth. They include naturalized citizens, legal permanent residents (green card holders), temporary workers, and foreign students. They do not include those born to immigrants in the United States, including to illegal immigrant parents, nor do they include those born in outlying U.S. territories, such as Puerto Rico. Prior research by the Department of Homeland Security and others indicates that some 90 percent of illegal immigrants respond to the ACS.

More HERE

*********************************

Time to stand back?

If Barack Obama had returned troops to Iraq six months ago, as his Administration and top military brass advised he must do, the Islamic State could have been all but destroyed. Deploying troops now is pointless, as the coalition’s warplanes have already run out of targets and IS fighters are vanishing into a civilian environment and preparing for a guerrilla war that neither side can win.

The part-Shia, 200,000-weak Iraqi army will either convert to Sunni Islam or be slaughtered along with all the others.

Baghdad, a city of seven million, is surrounded and being pounded by suicide bombers. Coalition supplies will soon be cut as its airport is overrun. We are watching the equal of Napoleon’s generalship as Iraq is ruthlessly carved up to form an Islamic caliphate with the deft precision of a surgeon’s scalpel, and at the hour the West is at its weakest.

Tactically, this new war is already lost due to a President who dropped the ball and concentrated on golf, fundraisers and mid-term elections in three weeks’ time, in which his Senate will almost certainly be lost to the not-so-sheepish Republicans.

Iraq, as a result of the coalition of the willing’s original invasion, will now cease to exist as Shia Iran licks its lips at the prospect of renewing its old war with a newly-formed Sunni State that’s ripe for the picking.

Mastermind of the ISIS caliphate, Abu Bakr al-Baghdadi, who has out-shocked-and-awed the US, will become the new temporary Sunni Saddam Hussein, drawing new boundaries for Syria and, with Turkey’s assistance, destroying the Kurds’ hope for independence.

This war will be a benchmark for other Arab wars because al-Baghdadi has done it without an air force or navy, and without spare parts or engineers for his stolen US tanks and military equipment; he has left the US breathless with its violent urgency.

The Iraqi caliphate will have a more sophisticated and determined enemy on either flank as the West rearranges its allies and caters for traditional foes, Damascus and Tehran, the only two forces with the will and the ability to contain ISIS.

The real loser is Israel. It will now be faced with an emboldened nuclear Iran and an unsympathetic, weakened US that refuses to engage in another foreign loss.

Perhaps treacherous Turkey had the right idea: sit back and do nothing, just watch it happen. Perhaps Ankara realises that you can’t get only half involved in a war.

Anyway, ISIS is doing exactly what Turkey and Saddam Hussein wanted to do: eliminate the Kurds.

The West has learnt a tough lesson in tribal Arab politics driven by manic versions of Islam and now it’s time to retreat and let the dust settle. No more decent lives should be lost fighting worthless Islamic pigs hiding in households.

The futility of encouraging Arab springs and trying to remove Arab warlords and tyrants is now clear... it was only they who held their fragile nations together.

We missed the opportunity to repel ISIS; we were asleep; and now there is not a thing to be achieved by staying there. Bombing a few empty buildings and utes is simply not worth another public beheading of an innocent or the promotion of another homeland atrocity.

The West is spooked; the protected public is not used to seeing the reality of war on YouTube. But this is actually how wars have always been fought and won: mass executions, rapes, beheadings. It happens and it’s shocking, yet it’s no different to any other war except that now it’s being documented in gory detail and distributed on the internet.

The protagonists want you to see them committing their atrocities; they are filming and editing them with professional production crews. It’s a masterstroke that has psychologically devastated the enemy’s will to fight. Only the under-armed Kurds have stood their ground.

The US hid the worst of its Vietnam war crimes as it threw its massive air power at an invisible enemy, but in the end it needed to ignominiously escape from Saigon... it should now prepare to escape from Baghdad.

And when the dust does settle surely this time we will understand the folly of interfering in tribal Arab politics imbued with different versions of a stone-age Islam.

SOURCE

****************************



Wednesday, October 15, 2014


Obamacare and the Aims of Progressivism

Greg Scandlen notes that the provision of health care has historically been an arena populated in America by a host of civil society institutions. These institutions were purposefully displaced by government over the course of the past century:

These associations were formed by working class men and women from all ethnic groups. In some cases they owned and operated their own hospitals. They also provided schools and orphanages for the children of deceased members, sickness funds for members who were unable to work, relocation assistance to help workers go where the jobs were, and moral support to families in times of trouble.

In the early 20th Century, these organizations came under attack by the Progressive Movement, which opposed self-help as interfering with the preferred dependency on and loyalty to the State. The Progressives also disparaged traditional values such as thrift, which got in the way of an economy ever more dependent on consumer spending. One leader of the Progressives is quoted as arguing in 1916 that, “Democracy is the progress of all, through all, under the leadership of the wisest.” The idea that common workmen could provide for their own needs was offensive to those who thought only an educated elite could order the affairs of society.

Greg writes at length about this subject in a new paper from the Citizens’ Council for Health Freedom. The widespread provision of charity care and service was also a major factor – which has again been crowded out by government in the form of Obamacare:

As more Americans gain insurance under the federal health law, hospitals are rethinking their charity programs, with some scaling back help for those who could have signed up for coverage but didn’t.

The move is prompted by concerns that offering free or discounted care to low-income uninsured patients might dissuade them from getting government-subsidized coverage.

If a patient is eligible to purchase subsidized coverage through the law’s online marketplaces but doesn’t sign up, should hospitals “provide charity care on the same level of generosity as they were previously?” asks Peter Cunningham, a health policy expert at Virginia Commonwealth University.

Most hospitals are still wrestling with that question, but a few have gone ahead and changed their programs, Cunningham says.

The online charity care policy at Southern New Hampshire Medical Center in Nashua, for example, now states that “applicants who refuse to purchase federally-mandated health insurance when they are eligible to do so will not be awarded charitable care.”

The same rule disqualifies aid to those who refuse to apply for expanded Medicaid, which New Hampshire lawmakers voted to extend, beginning Aug. 15.

Little wonder that, given this type of crackdown on the charity care side of things and the expanded promise of coverage to new Medicaid recipients, hospitals are seeing another Emergency Room spike:

Experts thought if people bought health insurance through the Affordable Care Act, they would find a private doctor and stop using hospital emergency rooms for their primary care.

Well, more people have health insurance. But they are still crowding into emergency departments across the nation.

An online study by the American College of Emergency Physicians found that nearly half of its members have seen a rise in visits since Jan. 1, when ACA coverage began. A resounding 86 percent of the physicians said they expect that number to continue growing.

In Philadelphia, emergency room visits were 8 percent higher in June than in November 2013, according to the Delaware Valley Healthcare Council, which collects data from 70 percent of the region’s hospitals.

“We find that when people don’t have health care, there is a degree of pent-up demand,” said Alex Rosenau, president of the ER physicians’ group and an ER doctor in Allentown. “People finally feel like they can go get medical care once they have some insurance.”

The spike in emergency room visits isn’t totally surprising. Rosenau said when Massachusetts enacted its own health care reform in 2006, everyone predicted the newly insured would find a private doctor. Instead, emergency departments saw a 3 to 7 percent increase in volume.

“Insurance does not equal access,” said Rosenau, adding that his group believes everyone should have access to care. “They know when they go to the emergency department, they are going to be seen.”

Complicating the matter is the growing shortage of primary care physicians. People who have never had a private doctor may have trouble finding one. So they continue to rely on emergency rooms.

It’s almost as if the crowding-out effects of government can have negative or unanticipated ramifications, particularly when they impact and warp the decisions people make about their lives.

SOURCE

*******************************

Liberating the Poor from the Medicaid Ghetto

Medicaid is a massive federal-state entitlement program desperately in need of reform. Its mission is to provide health care to the poorest of the nation’s poor ... and thus the poor have the most to gain from positive reform efforts, says Peter Ferrara, a senior fellow for The Heartland Institute and author of a new Heartland Policy Brief, “Liberating the Poor from the Medicaid Ghetto.”

According to the Centers for Medicare and Medicaid Services, which administers Medicaid, federal and state government spending for the program will total $6.56 trillion between 2013 and 2022. Medicaid is already the biggest line item in state budgets, and Medicaid spending will continue to grow, especially in the states that extended the reach of their programs under the Patient Protection and Affordable Care Act. (About half the states enacted the Medicaid expansion provided for under Obamacare, while half did not.)

Ferrara notes the absurdly high-cost Medicaid program delivers tragically low-quality care. Hospitals and physicians resist taking Medicaid patients because the program reimburses providers only about 60 percent of their costs associated with delivering care. “Medicaid patients face difficulties in obtaining timely, essential health care, suffering from adverse health as a result,” Ferrara writes.

As he has done in previous installments of his entitlement reform series of Policy Briefs, Ferrara urges modernizing Medicaid by block-granting the federal government’s share of funding to the states. He writes:

The unwillingness of health care providers to accept Medicaid patients because of the program’s shamefully low reimbursement rates could be addressed by extending to Medicaid the 1996 reforms of the Aid to Families with Dependent Children (AFDC) program. ... Each state would be free to use the funds for its own redesigned health care safety net program for the poor in return for work from the able-bodied.

Ferrara notes Congressman Paul Ryan (R-WI) included Medicaid block grants in his 2012 and 2013 budgets, and generally “[s]upport for such fundamental entitlement reform is now mainstream within the Republican Party.” He writes, “The current Medicaid system is so disastrous that those who support it cannot realistically be seen as caring about the poor. Their opposition to reform exposes a radical, impractical, counterproductive ideology to which they are wedded because it maximizes their power.”

SOURCE

***************************

On Halloween, Dems will be Haunted by their ObamaCare Pasts

Like the grim reality of death, there's no escaping Democrats' support for ObamaCare.

There's been considerable speculation over whether ObamaCare would manifest itself as a major election issue this year, in light of the media's focus on Ebola and the turmoil in the Middle East. Well, any doubt that ObamaCare still matters can be put to rest, with the announcement that 13 states and the District of Columbia will be sending out hundreds of thousands of insurance cancellation notices by the end of October, mere days before the November 4th elections.

This has got to be worrying for Senate Democrats up for reelection, whose support for the Affordable Care Act, the cause of these cancellations, will surely not resonate well with voters.

In tight races across the country, Democrats may now see their earlier words come back to haunt them. For example, in Arkansas, incumbent Senator Mark Pryor is on record referring to ObamaCare as "an amazing success story." How Pryor can defend this claim as thousands more Americans get thrown off the insurance rolls is anybody's guess.

In Colorado, as Mark Udall tries to beat back a strong challenge from Republican Cory Gardner, he will have to bear the consequences for repeating the now-infamous lie: "If you have an insurance policy you like, doctor or medical facility that provides medical services for you, you'll be able to keep that doctor or that insurance policy."

In Louisiana, where Republican challenger Bill Cassidy is gaining ground against Mary Landrieu, the Democratic Senator will have to defend her claim that ObamaCare would drive down insurance costs for families and businesses.

Only one of these Democrats, Bruce Braley in Iowa, seems to have had the sense to run from the president's signature policy, referring to ObamaCare as a "big failure." It's clever political posturing, but it rings awfully hollow in light of Braley's repeated refusal to vote for repeal or defunding of the law as a Member of the House of Representatives. It's just another substanceless campaign statement that makes Braley look like an empty suit, blowing feebly in the direction of the political winds.

In fact, all of the above Democrats supported ObamaCare, not only with their words, but with consistent, repeated votes to make sure the law continues to wreak havoc on the country's medical system.

As more Americans lose their health insurance in the days and weeks before the elections, Democrats are going to find it increasingly hard to hide from their abysmal voting records.

SOURCE

*************************

We Need Good Preachers Before We Get Good Government

In his 1798 letter to the officers of the First Brigade of the Third Division of Massachusetts’ Militia, America’s second president, John Adams, made a famous observation about the U.S. Constitution: "We have no government armed with power capable of contending with human passions unbridled by morality and religion. Avarice, ambition, revenge, or gallantry would break the strongest cords of our Constitution as a whale goes through a net. Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other.”

Noting its limited scope and enumerated powers, Adams argued such a founding document would adequately govern given the personal and civic decorum and the decency of the citizens of the United States. Flip that coin over to understand that without the absolutes of right and wrong woven into the tapestry of a moral and religious people, an overreaching and excessive government would follow.

Dial the clock forward to 2014 America -- the nation devoted to the god of me, myself and I, rather than the Hand of Providence of earlier years. Consider the cries of discrimination, intolerance and even racism, when societal standards of what is right, decent and good are most perfectly summed up by the bumper sticker, “WHATEVER!”

This cultural casserole of conscience shuns “a moral and religious people” and heralds the governing elites who view their intellect as superior to the weak leaning on the crutch of faith and religion. These 21st century elites openly mock the belief in and reverence of the Judeo-Christian Deity who endows His creation with unalienable rights, demands personal responsibility, shows love and mercy through community benevolence and charity, and has a dim view of laziness, lying and corruption.

Yet a society composed of individuals who subscribe to honesty, individual discipline and industriousness, mutual respect of persons and property, along with a measure of good will and charity, is a free people. Such a society will enjoy Liberty driven not by external lists and constraints of law, but by internal goodness and the “Golden Rule.”

As we navigate the path toward the elections of 2014 and 2016, we ask this: Instead of winning the argument and exacting policy, isn’t the more bountiful fruit to sustain our Constitution’s limited government enjoyed by winning hearts and minds to live a life of faith?

Which brings us to former Arkansas governor and TV personality Mike Huckabee. Following last week's Supreme Court refusal to hear cases on same-sex marriage, Huckabee vowed he would leave the Republican Party if the fight against same-sex marriage and abortion did not continue as a primary political plank of the party. It's not the first such declaration from those of faith who seek higher office or lead in an elected position.

Yet “a house divided will not stand.” The nation's Mike Huckabees should be cautious about abandoning the political vehicle that most frequently and effectively opposes the party whose membership voted God out at the 2012 Democrat National Convention.

Isn’t it even more critical in this cultural battle that those of the Judeo-Christian faith season their environs by being the “salt of the earth” rather than taking their 50-pound salt block into isolation?

In using John Adams’ observation to inform our center-right pursuits, fiscal restraint and discipline, economic success and might, along with a populace of individual accountability and productivity, are more likely when our Judeo-Christian God informs our politics and drives our conduct.

One might say John Adams was observing that good preachers precede good government.

SOURCE

****************************



Tuesday, October 14, 2014


Our Judicial Dictatorship

By Pat Buchanan

Do the states have the right to outlaw same-sex marriage?  Not long ago the question would have been seen as absurd. For every state regarded homosexual acts as crimes.

Moreover, the laws prohibiting same-sex marriage had all been enacted democratically, by statewide referenda, like Proposition 8 in California, or by Congress or elected state legislatures.

But today rogue judges and justices, appointed for life, answerable to no one, instruct a once-democratic republic on what laws we may and may not enact.

Last week, the Supreme Court refused to stop federal judges from overturning laws banning same-sex marriage. We are now told to expect the Supreme Court itself to discover in the Constitution a right of men to marry men and of women to marry women.

How, in little more than half a century, did the American people fall under the rule of a judicial dictatorship where judges and justices twist phrases in the Constitution to impose their alien ideology on this once-free people?

What brings the issue up is both the Court decision on same-sex marriage, and the death of my friend, Professor William J. Quirk, of the University of South Carolina School of Law.

In "Judicial Dictatorship" (1995), Bill wrote of the revolution that had been imposed against the will of the majority, and of how Congress and the people might rout that revolution.

The instrument of revolution is judicial review, the doctrine that makes the Supreme Court the final arbiter, the decider, of what the Constitution says, and cedes to the Court limitless power to overturn laws enacted by the elective branches of government.

Jefferson said that to cede such authority to the Supreme Court "would place us under the despotism of an oligarchy." Was he not right?

Consider what has transpired in our lifetime.

The Supreme Court has ordered the de-Christianization of all public institutions in what was a predominantly Christian country. Christian holy days, holidays, Bibles, books, prayers and invocations were all declared to be impermissible in public schools and the public square.

Secular humanism became, through Supreme Court edict, the established religion of the United States.

And the American people took it.

Why was there not massive civil disobedience against this anti-Christian discrimination, as there was against segregation? Why did Congress, which has the power to abolish every federal district and appellate court and to restrict the jurisdiction of the Supreme Court, not act?

Each branch of government, wrote Jefferson, is "independent of the others and has an equal right to decide for itself what is the meaning of the Constitution in the cases submitted to its action."

"No branch has the absolute or final power to control the others, especially an unelected judiciary," added Quirk.

In 1954, the Supreme Court ordered the desegregation of all public schools. But when the Court began to dictate the racial balance of public schools, and order the forced busing of children based on race across cities and county lines to bring it about, a rebellion arose.

Only when resistance became national and a violent reaction began did our black-robed radicals back down.

Yet the Supreme Court was not deterred in its resolve to remake America. In 1973, the Court discovered the right to an abortion in the Ninth Amendment. Then it found, also hidden in the Constitution, the right to engage in homosexual sodomy.

When Congress enacted the Defense of Marriage Act, Bill Quirk urged it to utilize Article III, Section 2 of the Constitution, and write in a provision stripping the Supreme Court of any right to review the act.

Congress declined, and the Court, predictably, dumped over DOMA.

Republican presidents have also sought to curb the Supreme Court's aggressions through the appointment process. And largely failed.

Of four justices elevated by Nixon, three voted for Roe. Ford's nominee John Paul Stevens turned left. Two of Reagan's, Sandra Day O'Connor and Anthony Kennedy, went wobbly. Bush I's David Souter was soon caucusing with the liberals.

Today, there are four constitutionalists on the Court. If the GOP loses the White House in 2016, then the Court is gone, perhaps forever.

Yet, the deeper problem lies in congressional cowardice in refusing to use its constitutional power to rein in the Court.

Ultimately, the failure is one of conservatism itself.

Indeed, with neoconservatives in the van, the GOP hierarchy is today in headlong retreat on same-sex marriage. Its performance calls to mind the insight of that unreconstructed Confederate chaplain to Stonewall Jackson, Robert Lewis Dabney, on the failure of conservatives to halt the march of the egalitarians:

"American conservatism is merely the shadow that follows Radicalism as it moves forward towards perdition. It remains behind it, but never retards it, and always advances near its leader. . Its impotency is not hard, indeed, to explain. It is worthless because it is the conservatism of expediency only, and not of sturdy principle. It intends to risk nothing serious, for the sake of the truth, and has no idea of being guilty of the folly of martyrdom."  Amen

SOURCE

******************************

CDC Mission Creep: A Dangerous and Wasteful Distraction

By: Josh Withrow

Any time a new infectious disease arises in the United States or throughout the world, Americans are assured that the Centers for Disease Control and Prevention (CDC) is taking measures to prevent an outbreak from turning into an epidemic. Recently, however, the arrival of Ebola patients in the United States has appeared to expose major flaws in the CDC's preparedness. A deeper examination of the CDC's focus and activities reveals an agency mired in classic mission creep, constantly nudged off-course by political pressures. Many of the CDC's uses over the past several decades are dubious enough on their own to recommend a reevaluation of the agency's reach. Recent events merely reinforce the need for the CDC to refocus on its vital primary mission: the prevention and control of infectious disease epidemics.

Brief History of CDC and Its Mission Creep

The Centers for Disease Control and Prevention was founded as the Communicable Disease Center in 1946, a small institution whose roots lay in the prevention of malaria and typhus outbreaks among U.S. troops in World War II. As its name suggested, the initial scope of CDC was mostly restricted to epidemiology  - the prevention and containment of large-scale outbreaks of deadly communicable diseases. While the CDC enjoyed a great deal of early success in its disease-containment mission, playing a key role in the eradication of polio in the U.S. and of smallpox worldwide, it quickly succumbed to mission creep and began to absorb a number of tangential or unrelated projects. By 2014, the CDC's mission has diversified to the point where its initial "Office of Infectious Diseases" is only one of five branches on its official organizational chart.

This fundamental overreach from the CDC's original (and essential) mission was fully codified in 1992, when the full title of the agency became the "Centers for Disease Control and Prevention".  Combined with the agency's move away from controlling only communicable diseases, the idea encapsulated within those two new words is frightening in its nearly limitless scope. One of the ways that the CDC has justified this mission creep is by employing a vastly expanded definition of the word "epidemic".  Although the word refers, strictly speaking, to the widespread outbreak of an infectious disease, the government and the CDC have taken to referring to anything that causes harm to a large number of people as an "epidemic". Thus, we get the "obesity epidemic," the "gun violence epidemic," the "alcoholism epidemic" -  as if America has experienced an epidemic of epidemics.

This substantial broadening of the CDC's jurisdiction has allowed it to be turned into a tool for political interests. Examples of CDC interventions into areas far outside the realm of infectious disease include policy recommendations to reduce obesity, salt consumption, tobacco and alcohol use, and even gun ownership. Notably, not only are all of these areas unrelated to the CDC's original infectious disease focus, they all overlap with other government agencies. Another major avenue for CDC mission creep has been its focus on "environmental health," which by its own definition includes "everything around us - the air we breathe, the water we drink and use, and the food we consume." This practically limitless mandate has led the CDC to involve itself in such non-disease topics as global warming, building construction, nutrition, and even "the health effects of gentrification."

CDC's Budget Bloat

In the wake of recent accusations that the CDC was caught flat-footed by the arrival of the Ebola virus in America, some officials and lawmakers have pointed to the sequestration budget cuts and other funding reductions at the CDC. As The Federalist's David Harsanyi notes, however, sequestration merely cut projected increases in spending, while a GAO report showed that most of the CDC's cutbacks came from decreases in grant funding. In fact, while the CDC complains about supposedly draconian budget cuts, its budget in 2014 ($6.4 billion) was more than triple its budget from just 1996 ($2.1 billion). The largest increases were in response to specific disease scares - the anthrax scare in 2001, and the H5N1 avian flu threat in 2005 - but notably the funding increases never went away. The CDC's budget request for FY 2015 remains at $6.6 billion total, already back to increasing annually after the modest cuts of 2013.

The most alarming aspect of the CDC's mission creep is not its budgetary impact (though  that is not insignificant); rather, it is the extent to which politically motivated research, funded by CDC grants, is used as scientific validation for invasive policy schemes government-wide.

CDC's Gun Violence Campaign Reined in by Congress after Venturing into Direct Lobbying

There are numerous examples of the CDC being used as a tool to attempt to advance political agendas at the federal and state level. Take, for example, the CDC's campaign against guns, which began in earnest with the creation of an Intentional Injuries Section within the Division of Injury Epidemiology and Control. Any notion that the division intended merely to study and observe gun violence was quickly dispelled, as the director gave an interview in 1993 in which it was revealed "he envisions a long-term campaign, similar to those on tobacco use and car safety, to convince Americans that guns are, first and foremost, a public health menace."

Finally, after the CDC began giving grant money to organizations that advocated for gun control, and published articles giving advice on how to lobby for gun control, Congress intervened by defunding much of the agency's gun violence research and explicitly prohibiting the CDC from promoting gun control.

In spite of this, President Obama has once again attempted to insert gun violence into the CDC's mandate, issuing a 2013 executive order asking the CDC to study "the causes of gun violence and ways to prevent it."

The gun control campaign was one of the federal government's more egregious attempts to use the CDC to advocate for issues well outside of the agency's mission, but there have been numerous similar examples of CDC overreach.

The Community Preventive Services Task Force: Science with an Agenda

Perhaps the best current example of the CDC's overreach into advocating for policy change is the Community Preventive Services Task Force, which "produces recommendations (and identifies evidence gaps) to help inform the decision making of federal, state, and local health departments, other government agencies, communities, healthcare providers, employers, schools and research organizations."

The Task Force is a supposedly neutral assembly of scientists that deals only in objective research, independent of the CDC or any other government body. In reality, the CDC director appoints the Task Force's 15 members, employs its 41 support staff, and ultimately disseminates the results and opinions from the Task Force's studies.

Somehow, the 15 Task Force members, who meet only three times per year, are currently expected to cover 22 different topics, ranging from asthma to birth defects to worksite health promotion.

Even if the Task Force members themselves are truly independent, there is simply no way that they can each develop informed opinions across so wide a variety of topics -- which clearly indicates that the 41 CDC support staff must be heavily involved in the selection and interpretation of the various studies and the guidelines drawn from them.

The full list on the Task Force's website: adolescent health, excessive consumption of alcohol, asthma, birth defects, cancer, cardiovascular disease, diabetes, emergency preparedness, health communication, health equity, sexually transmitted diseases and pregnancy, mental health, motor vehicle injury, nutrition, obesity, oral health, physical activity, social environment, tobacco, vaccination, violence prevention, and worksite health promotion.

Since the end result of these studies is the aforementioned "recommendations," which are intended to be used by both public and private entities, the CDC itself is effectively lobbying for government policy changes. These recommendations include clear instances of policy advocacy, such as increasing alcohol and tobacco taxes in order to discourage consumption through higher prices. In effect, the Task Force and its studies serve as a government-funded think tank for those who wish to regulate Americans' personal health decisions.

ObamaCare Slush Fund Used by CDC to Lobby for State Policy Change

An even more blatant recent use of CDC dollars for lobbying has been its grant funding under the Prevention and Public Health Fund (PPHF). Created as part of ObamaCare, the PPHF provides up to $2 billion each year to be spent without any congressional oversight, and much of that money has been allocated to the CDC. Some of the more ridiculous grants issued through the PPHF include studies of "dance fitness, massage therapy, painting bike lanes, salad bars in school cafeterias, pet neutering and urban gardening."

In a 2012 letter to HHS, Senator Susan Collins (R-Maine) noted that even as a supporter of the CDC's public wellness campaigns, "I am concerned about the appearance of impropriety in several instances where grantees ... appear to have used federal funds in attempts to change state and local policies and laws." In particular, Sen. Collins noted the guidelines for states to receive grants under the PPHF, which provided a list of strategies that recipients were "expected" to use "to produce the desired outcomes for the initiative." Several states then reported how they were using money received under this program to implement some of these very strategies, including changing zoning laws, proposing tax increases on selected products, and increasing tobacco regulations.

The HHS Inspector General concurred with Collins that the CDC's guidelines raised serious concerns about the use of its grant money.

CDC Used Faulty Study to Boost Obesity Campaign

Another high-profile CDC campaign, especially under recent administrations, has been its war against the obesity "epidemic". The CDC's activities in this arena have raised many of the same lobbying concerns as the campaigns discussed above. In fact, the current director of the CDC, Dr. Thomas Frieden, was hired to that post in 2009 straight from running New York City Mayor Bloomberg's infamous nanny-state campaigns against smoking, trans fats, and oversized sodas -- campaigns that were run with CDC funding.

Even before Dr. Frieden was hired, the CDC was caught in 2004 using a study that vastly inflated the number of actual obesity-related deaths in order to advance its advertising and policy-recommendation campaigns. The CDC initially touted a study showing that deaths related to obesity had increased by 33 percent from the previous decade, to 400,000 annually, but when its numbers were challenged the CDC admitted that the increase was less than 10 percent. Even scientists inside the CDC objected to the methodology used for the study, yet it was published and used anyway.

Embarrassingly, a subsequent CDC-backed study found that when other factors were accounted for, the net total of obesity-caused deaths was 25,814 -- 14 times fewer than the earlier estimate. Nevertheless, the CDC Director declared that her agency's anti-obesity campaign would neither scale back nor incorporate the new, smaller death toll.
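A minimal sketch of the arithmetic behind that "14 times fewer" figure, assuming -- my assumption, since the corrected number is not quoted above -- that the comparison was made against a corrected estimate of roughly 365,000 rather than the original 400,000:

# Ratio check for the competing obesity-death estimates discussed above
original_claim = 400_000   # CDC's initial annual estimate
corrected_claim = 365_000  # assumed corrected figure (not given in the text)
net_estimate = 25_814      # later CDC-backed net estimate

print(original_claim / net_estimate)   # ~15.5
print(corrected_claim / net_estimate)  # ~14.1 -- matches "14 times fewer"

Either way, the gap between the CDC's headline number and the net estimate is well over an order of magnitude.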

Clearly, in the case of its obesity data scandal, the CDC had no intention of letting science get in the way of its desired frightening narrative.

More HERE

There is a new lot of postings by Chris Brand just up -- on his usual vastly "incorrect" themes of race, genes, IQ etc.

****************************

For more blog postings from me, see  TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, GREENIE WATCH,  POLITICAL CORRECTNESS WATCH, AUSTRALIAN POLITICS, and Paralipomena (Occasionally updated) and Coral reef compendium. (Updated as news items come in).  GUN WATCH is now mainly put together by Dean Weingarten.

List of backup or "mirror" sites here or  here -- for when blogspot is "down" or failing to  update.  Email me  here (Hotmail address). My Home Pages are here (Academic) or  here (Pictorial) or  here  (Personal)

****************************

Monday, October 13, 2014


Dubious support for a popular health scare

There is a substantial body of people who get their jollies out of finding "threats" to health in popular products. They obviously want to appear wiser than the rest of us poor sods. There seems to be no accepted name for them but I call them "food & health freaks". They are probably best known for their unfounded demonization of salt, saturated fat and artificial sweeteners such as aspartame. Sugar is their current big boogeyman.
   
A somewhat less well-known scare concerns Bisphenol A (BPA), a component of many plastics. A few molecules of BPA have been shown to leach out of plastic bottles into the liquid they contain. For that reason plastic baby bottles have been more or less banned and glass baby bottles are mostly used instead. Glass is of course fragile and dangerous when broken, but any decrease in safety from its use in baby bottles is ignored by the food & health freaks.

The question is, however: how toxic is ingested BPA? Rats given enough of it certainly fall ill but, as Paracelsus pointed out long ago, the toxicity is in the dose. It seems unlikely that the few molecules received from a plastic bottle are harmful, and that is what most studies of the matter show. Like a terrier that won't let go of a bone, however, "research" to detect harm goes on among the food & health freaks.

The latest stab at BPA has just come out in JAMA, and I give the abstract below. I have, however, read the whole article and I would summarize the results rather differently. What they found was that the amount of BPA in the pregnant mother's urine correlated marginally significantly (p = .03) with the infant's lung function 4 years after birth but not 5 years after birth. That is a very shaky finding indeed and shows, if anything, that BPA is safe. They also looked at the correlation between mother-reported wheezing in the kid and BPA levels, but that correlation failed to reach statistical significance (p = .11).

They do, however, rather desperately hang their hat on a correlation with wheeze drawn from the BPA concentration in the mother at 16 weeks. That correlation had vanished by 26 weeks' gestation, however, so again the results actually show that BPA is safe -- no lasting ill-effects.
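One quick way to see how weak these findings are is to look at the confidence intervals reported in the abstract below: a 95% confidence interval for an odds ratio that straddles 1.0 is, by definition, not statistically significant at the conventional p < .05 level. A minimal sketch in Python, using only the two odds ratios from the abstract:

# Significance check for the odds ratios reported in the abstract:
# a 95% CI that includes 1.0 is not significant at p < .05.
results = {
    "wheeze odds per 10-fold rise in maternal BPA": (1.55, 0.91, 2.63),
    "persistent wheeze vs 16-week maternal BPA":    (4.27, 1.37, 13.30),
}

for name, (odds_ratio, ci_low, ci_high) in results.items():
    significant = ci_low > 1.0 or ci_high < 1.0  # CI entirely above or below 1.0
    print(f"{name}: OR {odds_ratio}, 95% CI {ci_low}-{ci_high} -> "
          f"{'significant' if significant else 'not significant'}")

The first interval straddles 1.0, so the overall wheeze association fails; the second excludes 1.0 but is so wide (1.37 to 13.30) that it pins down almost nothing.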

Not much there for the BPA freaks. I am not alone in that conclusion. The abstract follows:

>>>>>>>>>>>>>>>>>>

Bisphenol A Exposure and the Development of Wheeze and Lung Function in Children Through Age 5 Years

Adam J. Spanier et al.

ABSTRACT

Importance
Bisphenol A (BPA), a prevalent endocrine-disrupting chemical, has been associated with wheezing in children, but few studies have examined its effect on lung function or wheeze in older children.

Objectives
To test whether BPA exposure is associated with lung function, with wheeze, and with pattern of wheeze in children during their first 5 years.

Design, Setting, and Participants
A birth cohort study, enrolled during early pregnancy in the greater Cincinnati, Ohio, area among 398 mother-infant dyads.
We collected maternal urine samples during pregnancy (at 16 and 26 weeks) and child urine samples annually to assess gestational and child BPA exposure.

Main Outcomes and Measures
We assessed parent-reported wheeze every 6 months for 5 years and measured child forced expiratory volume in the first second of expiration (FEV1) at age 4 and 5 years. We evaluated associations of BPA exposure with respiratory outcomes, including FEV1, child wheeze, and wheeze phenotype.

Results
Urinary BPA concentrations and FEV1 data were available for 208 children and urinary BPA concentrations and parent-reported wheeze data were available for 360 children. The mean maternal urinary BPA concentration ranged from 0.53 to 293.55 µg/g of creatinine. In multivariable analysis, every 10-fold increase in the mean maternal urinary BPA concentration was associated with a 14.2% (95% CI, -24.5% to -3.9%) decrease in the percentage predicted FEV1 at 4 years, but no association was found at 5 years. In multivariable analysis, every 10-fold increase in the mean maternal urinary BPA concentration was marginally associated with a 54.8% increase in the odds of wheezing (adjusted odds ratio, 1.55; 95% CI, 0.91-2.63). While the mean maternal urinary BPA concentration was not associated with wheeze phenotype, a 10-fold increase in the 16-week maternal urinary BPA concentration was associated with a 4.27-fold increase in the odds of persistent wheeze (adjusted odds ratio, 4.27; 95% CI, 1.37-13.30). Child urinary BPA concentrations were not associated with FEV1 or wheeze.

Conclusions and Relevance
These results provide evidence suggesting that prenatal but not postnatal exposure to BPA is associated with diminished lung function and the development of persistent wheeze in children.

SOURCE

***********************************

Why do so many liberals despise Christianity?

Liberals increasingly want to enforce a comprehensive, uniformly secular vision of the human good. And they see alternative visions of the good as increasingly intolerable.

Liberalism seems to have an irrational animus against Christianity. Consider these two stories highlighted in the last week by conservative Christian blogger Rod Dreher.

Item 1: In a widely discussed essay in Slate, author Brian Palmer writes about the prevalence of missionary doctors and nurses in Africa and their crucial role in treating those suffering from Ebola. Palmer tries to be fair-minded, but he nonetheless expresses "ambivalence," "suspicion," and "visceral discomfort" about the fact that these men and women are motivated to make "long-term commitments to address the health problems of poor Africans," to "risk their lives," and to accept poor compensation (and sometimes none at all) because of their Christian faith.

The question is why he considers this a problem.

Palmer mentions a lack of data and an absence of regulatory oversight. But he's honest enough to admit that these aren't the real reasons for his concern. The real reason is that he doesn't believe that missionaries are capable "of separating their religious work from their medical work," even when they vow not to proselytize their patients. And that, in his view, is unacceptable -- apparently because he's an atheist and religion creeps him out. As he puts it, rather wanly, "It's great that these people are doing God's work, but do they have to talk about Him so much?"

That overriding distaste for religion leads Palmer to propose a radical corollary to the classical liberal ideal of a separation between church and state - one that goes far beyond politics, narrowly construed. Palmer thinks it's necessary to uphold a separation of "religion and health care."

Item 2: Gordon College, a small Christian school north of Boston, is facing the possibility of having its accreditation revoked by the higher education commission of the New England Association of Schools and Colleges, according to an article in the Boston Business Journal. Since accreditation determines a school's eligibility to participate in federal and state financial aid programs, and the eligibility of its students to be accepted into graduate programs and to meet requirements for professional licensure, revoking a school's accreditation is a big deal -- and can even be a death sentence.

What has Gordon College done to jeopardize its accreditation? It has chosen to enforce a "life and conduct statement" that forbids "homosexual practice" on campus.

Now, one could imagine a situation in which such a statement might legitimately run afoul of an accreditation board or even anti-discrimination statutes and regulations - if, for example, it stated that being gay is a sign of innate depravity and that students who feel same-sex attraction should be subject to punishment for having such desires.

But that isn't the case here. At all. In accordance with traditional Christian teaching, Gordon College bans all sexual relationships outside of marriage, gay or straight, and it goes out of its way to say that its strictures against homosexual acts apply only to behavior and not to same-sex desires or orientation.

The accreditation board is not so much objecting to the college's treatment of gays as it is rejecting the legitimacy of its devoutly Christian sexual beliefs.

The anti-missionary article and the story of Gordon College's troubles are both examples (among many others) of contemporary liberalism's irrational animus against religion in general and traditional forms of Christianity in particular.

My use of the term "irrational animus" isn't arbitrary. The Supreme Court has made "irrational animus" a cornerstone of its jurisprudence on gay rights. A law cannot stand if it can be shown to be motivated by rationally unjustifiable hostility to homosexuals, and on several occasions the court has declared that traditional religious objections to homosexuality are reducible to just such a motive.

But the urge to eliminate Christianity's influence on and legacy within our world can be its own form of irrational animus. The problem is not just the cavalier dismissal of people's long-established beliefs and the ways of life and traditions based on them. The problem is also the dogmatic denial of the beauty and wisdom contained within those beliefs, ways of life, and traditions. (You know, the kind of thing that leads a doctor to risk his life and forego a comfortable stateside livelihood in favor of treating deadly illness in dangerous, impoverished African cities and villages, all out of a love for Jesus Christ.)

Contemporary liberals increasingly think and talk like a class of self-satisfied commissars enforcing a comprehensive, uniformly secular vision of the human good. The idea that someone, somewhere might devote her life to an alternative vision of the good -- one that clashes in some respects with liberalism's moral creed -- is increasingly intolerable.

That is a betrayal of what's best in the liberal tradition.

Liberals should be pleased and express gratitude when people do good deeds, whether or not those deeds are motivated by faith. They should also be content to give voluntary associations (like religious colleges) wide latitude to orient themselves to visions of the human good rooted in traditions and experiences that transcend liberal modernity - provided they don't clash in a fundamental way with liberal ideals and institutions.

In the end, what we're seeing is an effort to greatly expand the list of beliefs, traditions, and ways of life that fundamentally clash with liberalism. That is an effort that no genuine liberal should want to succeed.

What happened to a liberalism of skepticism, modesty, humility, and openness to conflicting notions of the highest good? What happened to a liberalism of pluralism that recognizes that when people are allowed to search for truth in freedom, they are liable to seek and find it in a multitude of values, beliefs, and traditions? What happened to a liberalism that sees this diversity as one of the finest flowers of a free society rather than a threat to the liberal democratic order?

I don't have answers to these questions - and frankly, not a lot hinges on figuring out how we got here. What matters is that we acknowledge that something in the liberal mind has changed, and that we act to recover what has been lost.

SOURCE

******************************

Panetta's 'Worthy Fights' Over Obama's Ego

Leon Panetta's memoir, "Worthy Fights," is causing a big stir in Washington and beyond. Panetta was a major player in the president's national security team as CIA director and then defense secretary. The release of his book couldn't be more timely, and the way it's being received by the White House and the media couldn't be more telling of the current state of affairs in the Obama administration.

When Panetta came to the administration, he already had a well-established career in Democrat politics. He had served eight terms in Congress before Bill Clinton recruited him in 1993 to run the Office of Management and Budget. Panetta then became Clinton's chief of staff, taking on the job of bringing order to the political free-for-all that was the White House during the second half of Clinton's first term. After that, he spent time doing what politicos often do when they leave office -- he established a policy group, lectured and did some teaching. Then he was tapped by Obama to head the CIA in 2009, and two years later, he became Pentagon chief, wrapping up his service shortly after the beginning of Obama's second term.

For those of us who see Obama's foreign policy for the malfeasance that it is, Panetta's grocery list of national security screw-ups doesn't come as a surprise. What's interesting is how he tries to walk a tightrope of offering praise for the president while skewering him at the same time. Panetta takes pains to hail Obama's keen intellect, as so many who have served with the president often do, but his recollections actually go on to refute that flattery.

Panetta recounts through several episodes that the president lacks the passion of a leader and repeatedly exhibits "a frustrating reticence to engage his opponents and rally support for his cause." Wouldn't someone with a keen intellect recognize that leadership is crucial to achieving his goal? And, if he believed in his ideas, wouldn't he be willing to actively defend them with logic rather than petulant political attacks on the opposition?

Iraq is a prime example of Panetta's account of Obama's poor leadership. He details how Obama basically sabotaged that country's future by letting his desire to fulfill a campaign pledge - get America out of Iraq - cloud the basic fact that America's military presence was integral to keeping the country together. The White House was "so eager to rid itself of Iraq," Panetta said, "that it was willing to withdraw rather than lock in arrangements that would preserve our influence and interests."

Furthermore, Panetta wrote, "My fear, as I voiced to the President and others, was that if the country split apart or slid back into the violence that we'd seen in the years immediately following the U.S. invasion, it could become a new haven for terrorists to plot attacks against the U.S." His stance, he said, "reflected not just my views but also those of the military commanders in the region and the Joint Chiefs." So Obama's "keen intellect" won out over his knowledgeable advisers.

Indecision, combined with deliberately setting unrealistic expectations for Iraq's fragile government, essentially sank the status of forces agreement that the U.S. was trying to hammer out with then-Iraqi leader Nouri al-Maliki. Obama pleased his constituents, but Panetta argues the end result was "a vacuum in terms of the ability of that country to better protect itself, and it's out of that vacuum that [ISIL] began to breed." (Someone else warned about that too.) Now we've got boots back in the air, fighting what Panetta says should be a "long and sustained battle."

Panetta's motives aren't pure. He's obviously out to sell books, and he may even be angling for a position (secretary of state?) in a Hillary Clinton administration. But Panetta has also captured from the inside what we've been saying about Obama all along -- essentially that the president is a narcissist who ignores wise advice in pursuit of his own ideological agenda. In Iraq, that's proved disastrous. And it's worth hammering home.

SOURCE

****************************

For more blog postings from me, see  TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, GREENIE WATCH,  POLITICAL CORRECTNESS WATCH, AUSTRALIAN POLITICS, and Paralipomena (Occasionally updated) and Coral reef compendium. (Updated as news items come in).  GUN WATCH is now mainly put together by Dean Weingarten.

List of backup or "mirror" sites here or  here -- for when blogspot is "down" or failing to  update.  Email me  here (Hotmail address). My Home Pages are here (Academic) or  here (Pictorial) or  here  (Personal)

****************************



Sunday, October 12, 2014


A Western Heart

Most of what I put online in my blogs is commentary and news reports written by others that I find interesting from a libertarian/conservative point of view. So I could possibly be seen as a sort of Readers Digest for libertarian/conservatives. In fact, however, on most days I do put up commentary somewhere on one of my blogs that I have written myself. When such comments stretch to more than a sentence or two, I put them up on the blog A Western Heart (AWH) -- a convenient way of keeping my own writings together for my own reference.

The blog was originally created by a group of Australian bloggers but the rest of them all gradually burnt out -- leaving me as the only surviving blogger.  I am persistent if nothing else.

But another important reason for using AWH has to do with a certain search engine whose name begins with G. For some reason not clear to me, AWH gets a much higher priority in searches than any of my other blogs do. If I search for some content that I have put up in more than one place, the AWH entry comes up first by far. I don't know why but I am glad to take advantage of it.

*************************

Obama



***************************

A little history: Thomas Jefferson Started A War Against Fundamentalist Islam Over 200 Years Ago

Most Americans are unaware of the fact that over two hundred years ago, the United States declared war on Islam, and Thomas Jefferson led the charge! At the height of the eighteenth century, Muslim pirates were the terror of the Mediterranean and a large area of the North Atlantic. They attacked every ship in sight and held the crews for exorbitant ransoms. Those taken hostage were subjected to barbaric treatment and wrote heartbreaking letters home, begging their government and family members to pay whatever their Mohammedan captors demanded.

These extortionists of the high seas represented the Islamic nations of Tripoli, Tunis, Morocco, and Algiers -- collectively referred to as the Barbary Coast -- and presented a dangerous and unprovoked threat to the new American Republic.

Before the Revolutionary War, U.S. merchant ships had been under the protection of Great Britain. When the U.S. declared its independence and entered into war, the ships of the United States were protected by France. However, once the war was won, America had to protect its own fleets. Thus, the birth of the U.S. Navy.

In 1784, seventeen years before he would become president, Thomas Jefferson became America's Minister to France. That same year, the U.S. Congress sought to appease its Muslim adversaries by following in the footsteps of the European nations that paid bribes to the Barbary States rather than engaging them in war.

In July of 1785, Algerian pirates captured American ships, and the Dey of Algiers demanded an unheard-of ransom of $60,000. It was a plain and simple case of extortion, and Thomas Jefferson was vehemently opposed to any further payments. Instead, he proposed to Congress the formation of a coalition of allied nations that together could force the Islamic states into peace. An uninterested Congress decided to pay the ransom.

In 1786, Thomas Jefferson and John Adams met with Tripoli's ambassador to Great Britain to ask by what right his nation attacked American ships and enslaved American citizens, and why Muslims held so much hostility towards America, a nation with which they had no previous contacts.

The two future presidents reported that Ambassador Sidi Haji Abdul Rahman Adja had answered that Islam "was founded on the Laws of their Prophet, that it was written in their Quran, that all nations who should not have acknowledged their authority were sinners, that it was their right and duty to make war upon them wherever they could be found, and to make slaves of all they could take as Prisoners, and that every Musselman (Muslim) who should be slain in Battle was sure to go to Paradise."

Despite this stunning admission of premeditated violence against non-Muslim nations, and despite the objections of many notable American leaders -- including George Washington, who warned that caving in was both wrong and would only further embolden the enemy -- the American government spent the following fifteen years paying the Muslims millions of dollars for the safe passage of American ships and the return of American hostages. The payments in ransom and tribute amounted to over twenty percent of the United States government's annual revenue in 1800.

Jefferson was disgusted. Shortly after he was sworn in as the third President of the United States in 1801, the Pasha of Tripoli sent him a note demanding the immediate payment of $225,000 plus $25,000 a year for every year forthcoming. That changed everything.

Jefferson let the Pasha know, in no uncertain terms, what he could do with his demand. The Pasha responded by cutting down the flagpole at the American consulate and declaring war on the United States. Tunis, Morocco, and Algiers immediately followed suit.

Jefferson, until then, had been against America raising a naval force for anything beyond coastal defense, but having watched his nation be cowed by Islamic thuggery for long enough, he decided that it was finally time to meet force with force.

Painting of "Pirate Ship Burning in Tripoli Harbor," 1804 (U.S. Navy Archive)

He dispatched a squadron of frigates to the Mediterranean and taught the Muslims of the Barbary Coast a lesson he hoped they would never forget. Congress authorized Jefferson to empower U.S. ships to seize all vessels and goods of the Pasha of Tripoli and to "cause to be done all other acts of precaution or hostility as the state of war would justify".

When Algiers and Tunis, both accustomed to American cowardice and acquiescence, saw that the newly independent United States had both the will and the might to strike back, they quickly abandoned their allegiance to Tripoli.

The war with Tripoli lasted for four more years and flared up again in 1815. The bravery of the U.S. Marine Corps in these wars led to the line "to the shores of Tripoli" in the Marine Hymn. The Marines would forever be known as "leathernecks" for the leather collars of their uniforms, designed to prevent their heads from being cut off by Muslim scimitars when boarding enemy ships.

Islam, and what its Barbary followers justified doing in the name of their prophet and their god, disturbed Jefferson quite deeply. America had a tradition of religious tolerance -- Jefferson himself had co-authored the Virginia Statute for Religious Freedom -- but fundamentalist Islam was like no other religion the world had ever seen. A religion based on supremacism, whose holy book not only condoned but mandated violence against unbelievers, was unacceptable to him. His greatest fear was that someday this brand of Islam would return and pose an even greater threat to the United States.

This should bother every American: that Islamists have brought about women-only classes and swimming times at taxpayer-funded universities and public pools; that Christians, Jews, and Hindus have been banned from serving on juries where Muslim defendants are being judged; that piggy banks and Porky Pig tissue dispensers have been banned from workplaces because they offend Islamist sensibilities; that ice cream has been discontinued at certain Burger King locations because the picture on the wrapper looks similar to the Arabic script for Allah; that public schools are pulling pork from their menus -- and on and on in the newspapers….

It's death by a thousand cuts, or inch-by-inch as some refer to it, and most Americans have no idea that this battle is being waged every day across America. By not fighting back, by allowing groups to obfuscate what is really happening, and by not insisting that Islamists adapt to our culture, the United States is cutting its own throat with a politically correct knife and helping to further the Islamists' agenda.

Sadly, it appears that today's America would rather be politically correct than victorious.

SOURCE

Footnote: The North African pirates were eventually wiped out when, in 1830, the restored French monarchy sent 600 ships to the other side of the Mediterranean and took over North Africa. The invasion was chaotic but the defence was feeble, so the French won.

*********************************

Campaign finance reform isn't about "getting money out of politics," it's about silencing political dissent

Senate Democrats recently tried to push through a constitutional amendment that would have repealed free speech protections in the First Amendment, making Congress the sole arbiter of what is and isn't political speech. Thankfully, this effort, backed by Majority Leader Harry Reid (D-NV), failed to get the two-thirds majority needed to propose a constitutional amendment, killing the proposal for the remainder of the 113th Congress.

Though this effort failed, there stands a good chance that Democrats will, at some point down the road, launch another attempt to repeal the First Amendment, and it will again come under the guise of the Orwellian phrase "campaign finance reform." This phrase may sound nice, but the consequence, as George Will explains in a new video, is the silencing of political speech.

"We Americans are disposed to think that the word 'reform' is a synonym for 'improvement.' But what is called 'campaign finance reform' is nothing less than a frontal assault on the first, the most fundamental of our freedoms -- the freedom to speak our mind and to participate in politics," says Will, a Pulitzer Prize-winning columnist. "This assault is always conducted stealthily by people who pretend that they only want to regulate money, not speech. They say they are only concerned about the quantity of money in politics."

"You must remember this: People who say there is too much money in politics are necessarily saying three very sinister things. First, they're saying there is too much political speech. Second, they are saying that they know just the right amount of political speech. And third, they are saying that government should enforce the limits they want on the amount of political speech. That is, the government should regulate speech about the government," Will adds.

Despite the feel-good rhetoric Americans so frequently hear from so-called "campaign finance reformers," these efforts aren't about the presence of money in politics, but rather incumbent protection. Campaign finance laws are written by politicians to insulate themselves against criticism and accountability from constituents back home at the expense of one of our most cherished civil liberties.

SOURCE

*******************************

How Government Creates Poverty

John Stossel

Fifty years ago, President Lyndon Johnson declared "War on Poverty." It sounded great to me. I was taught at Princeton, "We're a rich country. All we have to do is tax the rich, and then use that money to create programs that will lift the poor out of poverty." Government created job-training programs for the strong and expanded social security for the weak.

It seemed to work. The poverty rate dropped from 17 percent to 12 percent in the programs' first decade. Unfortunately, few people noticed that during the half-decade before the "War," the rate dropped from 22 percent to 17 percent. Without big government, Americans were already lifting themselves out of poverty!

Johnson's War brought further progress, but progress then stopped. It stopped because government is not good at making a distinction between needy and lazy. It taught moms not to marry the father of their kids because that would reduce their welfare benefits. Welfare invited people to be dependent. Some people started to say, "Entry-level jobs are for suckers." Many could live almost as well without the hassle of work.

Despite spending an astonishing $22 trillion, despite 92 different government welfare programs, poverty stopped declining. Government's answer? Spend more!

Rep. Paul Ryan (R.-Wis.), chairman of the House Budget Committee, points out that government measures "success" by the growth of programs: "based on inputs, how much money are we spending, how many programs are we creating, how many people are we putting on these programs—not on outcomes—how many people are we getting out of poverty? ... Many of these programs end up disincentivizing work, telling people it pays not to go to work because you'll lose more in benefits than you gain in earning wages."

That doesn't mean the poor are lazy. It means they respond to incentives. They are rational about choosing behaviors that, at least in the short term, pay off.

It's not only welfare that makes it harder for the poor to climb the ladder of success. Well-intended laws, such as a minimum wage, hurt, too. But most people don't understand that. Even Republicans, according to opinion polls, support a higher minimum wage. A minimum sounds compassionate. It's hard to live on $7.25 an hour.

But setting a minimum is anything but compassionate because that eliminates starter jobs. The minimum wage is why kids don't work as apprentices anymore, nor clean your windshield at gas stations. They never get hired because employers reason, "If I must pay $9, I'm not taking a chance on a beginner."

To most economists, the claim that the minimum wage kills starter jobs is not controversial. But it is among the general public. And so politicians pander.

On my TV show this week, Rep. Jim McDermott (D.-Wash.) says that people like Paul Ryan and me "just want to cut the size of government. And trust the private sector to do everything." Well ... yes. The private sector does just about everything better.

McDermott says, "This whole business about somehow raising the minimum wage causes a loss of jobs—if that's true, why don't we just drop the minimum wage altogether and let people work for a dollar a day or $1 an hour?"

OK, let's do it! It's not as if wages are set by the minimum wage. That is a great conceit of the central planners: thinking that only government prevents employers from paying workers nearly nothing. But the reason Americans don't work for $1 an hour is competition, not government minimums. Competition is what forces companies to pay workers more. It doesn't much matter that the law says they can pay as little as $7.25. Only 4 percent of American workers now make that little; 95 percent make more.

The free market will sort this out, if politicians would just let it. Left free, the market will provide the greatest benefit to workers, employers, and consumers, while allowing charity as well.

It would all happen faster if politicians stopped imagining that they are the cause of everything.

SOURCE

***************************

An alarming double standard

Beck asked: “Can you think of a reason, an honest reason, that we have not banned the flights from West Africa yet? Why we’re not stopping this?”

“No,” Levin responded. “I think this is absolute insanity. Well, it’s obviously Obama. The reasons are obvious, because he doesn’t want to appear to be conducting himself in a way that discriminates against that continent.”

“You know what’s really crazy?” Beck added. “It took him all of 30 seconds to ban flights to the state of Israel. He banned flights to Israel over a suspected rocket launch. … This time, we have actual Ebola in our own hometowns, and he’s not banning any flights. Won’t even consider it.”

More HERE

****************************

For more blog postings from me, see  TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, GREENIE WATCH,  POLITICAL CORRECTNESS WATCH, AUSTRALIAN POLITICS, and Paralipomena (Occasionally updated) and Coral reef compendium. (Updated as news items come in).  GUN WATCH is now mainly put together by Dean Weingarten.

List of backup or "mirror" sites here or  here -- for when blogspot is "down" or failing to  update.  Email me  here (Hotmail address). My Home Pages are here (Academic) or  here (Pictorial) or  here  (Personal)

****************************