Sunday, September 25, 2016

Wage Disparities and Racial Discrimination








Some things are important to get right.  Actually, it would be better stated to say some things are too important to continually get wrong.  After decades of anti-discrimination policy, the wage gap between white workers and black workers is not narrowing.  The current trend is that the gap is growing.

Some of my past experiences with hiring policy worked along these lines: initiate policies and, once they were in place, hope they would be effective.  In the absence of a complaint, there was no way to know whether these efforts were working or just idling ineffectually in the background.  Safeguards against discrimination are often treated this way.  The EEOC is proposing that employers report a breakdown of their workers' pay by race, ethnicity, and gender.  Is this necessary?

No matter how you measure it, people of color are paid less than their white coworkers.  A recent Economic Policy Institute study found that even after accounting for factors such as education and experience, the wage gap persists.  The key finding is that once every other plausible explanation is taken into account, the pay disparities remain, and remain unexplained.
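For readers curious what "accounting for other factors" means in practice, here is a minimal sketch of the standard approach: regress log wages on a group indicator plus controls, and read the adjusted gap off the indicator's coefficient.  The data below are entirely synthetic and hypothetical (the gap of -15% is built in purely so the regression has something to recover); this is an illustration of the method, not the EPI study's actual data or model.

```python
# Illustrative sketch of an "adjusted" wage-gap regression.
# All data here are synthetic and hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical worker records: education (years), experience (years),
# and a group indicator (1 = black worker, 0 = white worker).
educ = rng.normal(13, 2, n)
exper = rng.uniform(0, 30, n)
black = rng.integers(0, 2, n)

# Build a -0.15 log-wage gap into the synthetic data so we can
# verify that the regression recovers it after controls.
log_wage = 1.5 + 0.08 * educ + 0.02 * exper - 0.15 * black \
           + rng.normal(0, 0.3, n)

# Ordinary least squares: intercept, education, experience, group.
X = np.column_stack([np.ones(n), educ, exper, black])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
adjusted_gap = beta[3]
print(f"adjusted log-wage gap: {adjusted_gap:.3f}")
```

The coefficient on the group indicator is the disparity left over after the controls have explained what they can; a persistent nonzero coefficient is what studies like EPI's describe as the "unexplained" gap.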

The point is this: when a company sets forth policies to lessen the potential for discrimination, the results are rarely evident in the data currently being collected.  You do not see a result.  Anti-discrimination policies are merely good intentions unless they are effective, and the practice of hiring more people of color, alone, does not solve the problem.  The EEOC may be on the right track here.  A recent article by Gene Demby of NPR's Code Switch (see article) encapsulates this well.

The EEOC proposal requires that the pay data be reported by September 30, 2017.  While private employers with 99 or fewer employees would be exempt from adding pay data to an EEO-1 filing, the same information would be useful internally for smaller businesses.

However well-intentioned your company policies, if your numbers are off, the effectiveness of your anti-discrimination practices may be off as well.



The EEOC fact sheet for the proposal is available here: Notice of Proposed Changes to the EEO-1 Report to Collect Pay Data from Certain Employers

Friday, July 4, 2014

Independence Day







On this July 4th, while celebrating American independence, I cannot help but look forward to September 18th: another bid for separation.  This time it's Scotland leaving the UK.  Unfathomably, the issue has not yet captured mainstream interest within the US.  The Independence Referendum (Indyref) will be held exactly 700 years after the Battle of Bannockburn, a decisive battle in the First War of Scottish Independence.  The Indyref is a complicated decision, but one that resonates across time, or should, for us Americans.

Why should we care?  First, in many ways we have a shared heritage.  Nine of the signers of the Declaration of Independence were Scots.  The document was, in fact, modeled after the Scottish declaration of independence: the Declaration of Arbroath.  A majority of the governors of the original 13 states were of Scottish ancestry.  Scottish Americans were instrumental in gaining independence and in the creation of a lasting democratic republic.  Today, 25 million Americans claim Scottish heritage.  The US is Scotland's largest trading partner, and its geostrategic position makes it a desirable defensive ally.

Risky business.  During a speech to business leaders and party activists on July 3, 2014, UK Prime Minister David Cameron urged a "silent majority" of Scots to stand up for a 'no' vote.  Cameron characterized this group as "…the silent majority who don't want the risks of going it alone…"

But, the Tory leader didn’t say risky for whom.

In a nation of 5.2 million people, Scotland is estimated to hold $2.5 trillion in oil and gas reserves, to say nothing of $7.3 billion from whisky exports in 2013 alone.

Self-determination.  If independence happens, there will be many things to work out.  Our American Declaration of Independence was signed by members of the Continental Congress in 1776, but the Constitution that followed it did not take effect until 1788.  Even then, things got off to a rocky start when federal powers of taxation were challenged by protest in 1791.  There may have been a commonality in what the ex-colonists didn't want, but it took decades to interpret and delineate federal powers and, finally, to develop a cohesive national identity.  Whatever the outcome, the decision of whether to continue or end a political union is in the hands of the Scottish voters.


And, I cannot help but wonder, will there be a Scottish sky filled with fireworks on September 18th?  



Thursday, May 22, 2014

Why Are People Kissing Camels?



Social media is rife… well, with lots of things.  Sometimes, odd things.  This time it's kissing camels.  Since it has come to light that camels are the likely link to Middle East Respiratory Syndrome (MERS), Saudi Arabian health authorities are warning that people should wear masks and gloves around the animals.  They further warn that folks should avoid sick camels, raw camel meat, and raw camel milk.  I don't know about you, but the news has completely put me off camel tartare.

In Saudi Arabia, camels are much more than cuddly, lovable, spitting pack animals.  They are an extremely valuable commodity.  The beasts are a significant source of income for many, but they are also bred for racing and hauling, and kept as pets.  Camel fanciers and farmers alike are treating these health warnings as unmerited.  All this camel kissing is in protest against what is perceived as an overreaction on the part of Saudi government health officials.

There are hundreds of thousands of camels in Saudi Arabia.  The MERS virus has killed over 170 people according to the World Health Organization, and it has been traced back to camels.  While the link is not absolutely conclusive, the combination of facts would seem to favor precaution.

The camel farmers, however, are having none of it.  Facebook, Twitter, and YouTube, are replete with a sort of camel-Saudi snogfest.

And, that’s why people are kissing camels.   ...See, that’s not really weird at all.












Tuesday, May 20, 2014

Hey, What Could Go Wrong?


Information and information technology are growing at an exponential rate.  Artificial intelligence (AI) shows promise to change how we do business, practice medicine, and run our households.  Advances in knowledge about the human brain are now being applied to computing: systems are being built based on how the biological nervous system works.  Computers will be able to learn from experience, apply that information in different situations, and even learn from their mistakes.  Pretty soon, personal assistants, bookkeepers, and data entry clerks may be digital employees.

But what if the AI was the boss?  A Hong Kong venture capital firm, Deep Knowledge Ventures, just named an algorithm to its board of directors.  The artificial intelligence, named Vital, was appointed to the board because of its superiority in identifying market trends; trends "not immediately obvious to humans."  The AI will eventually get an equal vote on all financial decisions.

If this Vital is smart, I bet it will vote to bring some of its AI buddies on board.  Wait!  …it is smart.  Super smart.  Of course it will… and then, out with the slow humans!  Humans will just delay optimum allocations and slow processing to their abysmally inferior pace.  Who wants to wait all those additional nanoseconds?

I don’t want to hurt any AI’s feelings, but I think I just might find it irksome being fired by my company’s new software package. 

Last month, world-renowned physicist Stephen Hawking gave the opinion that machine superintelligence could be the most significant thing ever to happen in human history, and possibly the last.  Hawking and colleagues warn:

"One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand.  Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all."

At the Centre for the Study of Existential Risk at the University of Cambridge (http://cser.org/), these possibilities are being considered: the idea that "developing technologies might lead – perhaps accidentally, and perhaps very rapidly, once a certain point is reached – to direct, extinction-level threats to our species."

Well, I personally hope that doesn't happen.  Up till now I had just been worrying about asteroid impacts or annihilation through earth's own processes.  In any case, we will soon need a new genderless pronoun for our digital comrades.

…and, possibly, considerations for a new special interest group.   








Monday, May 19, 2014

You're Wrong ...About My Being Right




Thinking is hard work.  Thinking about thinking, more so.  What influences our thinking also influences the conclusions we arrive at.  Like everything else, improvement in thinking comes with practice.  Our thinking can be of poor quality.  It's something I have been contemplating.

I have been guilty of superficial thinking, intellectual laziness, and inconsistent thought.  Worse yet, I have sometimes regurgitated an opinion of a respected expert or peer without gathering enough information to form my own conclusions based on facts. 

I'm just cynical enough to believe that many people do not really want to learn anything.  They avoid conversations that do not support their views.  Rather than responding with facts, they seek to discredit those they disagree with.  It has taken some introspection to separate my general perception of myself from actual fact.  I too have been guilty of using loaded terms and evoking an emotional response, just to garner a win for my side.  In the end, I may be "winning," but I am not positively influencing anyone; and I'm certainly not learning anything myself.

I have recently taken some small steps towards becoming an active critic of my thinking.  In doing so, I have found the following guidelines helpful:

  • Facts should not simply be absorbed.  Facts should always be accompanied by questions.
  • Don't jump to conclusions.  Find out more.  Find out more from opposing viewpoints.
  • Listen.  Don't just wait for your opportunity for rebuttal; listen.
  • Stay focused.  Finish thinking a problem or issue through.  Don't divert.  What is pertinent?  What is merely a distraction?

As humans, we're hard-wired for social interaction.  In groups, learning is often less important than conveying that we are like-minded with our peers.  If we're not careful, we allow other people or groups to do our thinking for us.  The quality of our thinking is critical to solving problems efficiently.  Quality thinking allows us to draw appropriate conclusions and reduce the time spent suffering the consequences of irrational thought.


"We learn more by looking for the answer to a question and not finding it than we do from learning the answer itself."   ~ Lloyd Alexander

Sunday, May 18, 2014

Honest debate?

I belong to a small minority.  I have not yet declared my official position on global climate change.  I should re-phrase that: my position on man-made global climate change.  It gets me yelled at a lot.  This strikes me as odd.  I haven’t declared my position on man-made global volcanism, tectonic subduction, or polar shift either, but those issues don’t get me yelled at.  I guess I’m an easy target because I’m a skeptic.  Wait, wait… before you start yelling- I didn’t say I was convinced against it.  I’m still reading.

Part of the problem is my interest in previous ice ages.  When did they start and end?  How did they begin, and how did we emerge from them?  Since earth's most recent ice age, things have been warming.  This warming was accompanied by a corresponding rise in sea level as ice sheets melted.  The effects were immense.  Much of what we see in our terrain today is the result of glaciation and melt.  The Great Lakes, for instance, were created by glacial scour and pooling.
  

[Figure: land mass about 12,000 years ago]

About 15,000 years ago the glaciers began to retreat, and some 10,000 years ago the big melt started slowing.  About 7,000 years later, coastal wetlands began to form.  So, in very rough terms, our current sea levels were attained within the last 3,000 years or so.  I guess to some, that is what the temperature and sea levels were supposed to be all along.

We made it!  … now, to keep it that way. 

None of that means that modern man is not the cause of a lot of potential havoc.  We're the culprit for a great deal of harm to our planet.  It's just that both sides of the argument are immovable and deaf to any dissent.  That is unfortunate, because public policy is based on this information.  I'd like to take this information seriously, but there are a lot of fallacious arguments, and some past predictions just didn't pan out.  About seven years ago, Al Gore told us that all Arctic summer sea ice could be gone by 2013.  Also notable:

  • Within a few years children just aren't going to know what snow is.  Snowfall will be a very rare and exciting event.  ~Dr. David Viner, senior climate research scientist, March 2000.
  • The world will be eleven degrees colder by the year 2000.  ~Kenneth Watt, Earth Day, 1970.

There’s a lot of compelling physical evidence that what we’re doing to our planet is grossly damaging.  On the other hand, a lot of government funding is directed at science that achieves the right results.  How am I as a non-scientist to know what to think?

In 2005, Science magazine warned that we could anticipate a catastrophic trend towards more frequent and intense hurricanes as a direct result of man's impact on our planet.  Some hurricane researchers agreed; others disagreed.



Whichever viewpoint I select as the more compelling will earn me a label from one camp or the other, and, assuredly, more yelling.

On Being a Luddite, a Denier, an Arrogant Bastard

I cannot help but be impressed with how much the world of academia disdains dissenting viewpoints.  For that matter, I am impressed by the immovability of both the left and the right on social thought.  For certain loaded subjects, I'm confronted with: "absolute consensus," "irrefutability," "complete certainty."  There are terms for those who just might be skeptical of these claims.  Some aren't very nice.  Lately, there's even a trend towards the pathologisation of dissent.  If I disagree, I probably have a mental illness.  This is not offered up by one or two crackpot editorialists, but soberly studied at a college near you.

It's somewhat confusing.  In 2012, CERN reported approaching the seven sigma threshold for observation of the Higgs boson.  So, at a 0.0000000001% chance that they haven't, some physicists still felt the need to clear their throats and say "Higgs-like" particle.  So, wait… if my personal opinion is that they might not have actually found the Higgs boson, I'm not labeled a crazy Higgs denouncer?  Rational skepticism, even at 7 sigma?
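For the curious, the sigma-to-probability arithmetic behind that tiny percentage is easy to check yourself.  This is just the textbook one-sided tail probability of a normal distribution, not CERN's actual statistical machinery, but it shows why seven sigma corresponds to a chance of roughly one in a trillion:

```python
# Convert an n-sigma significance to a one-sided normal tail probability.
import math

def p_value(sigma):
    """One-sided tail probability of a standard normal at `sigma` deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p5 = p_value(5)  # the conventional particle-physics "discovery" threshold
p7 = p_value(7)

print(f"5 sigma: p = {p5:.2e}")  # roughly 3e-07
print(f"7 sigma: p = {p7:.2e}")  # roughly 1e-12, i.e. about 0.0000000001%
```

Seven sigma works out to a p-value near 1.3e-12, which matches the post's 0.0000000001% figure, and is far beyond the five-sigma bar physicists already treat as a discovery.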

The take-away here is that only certain causes and "right-thinking" garner such vehemence.  Mention those subjects as a skeptic and you just may become acquainted with new pejorative terms.

There appears to be a label for each type of abhorrent ideal and a trend towards the vilification of its standard-bearers.  Often, these perceived purveyors of evil have no real power.  Interestingly, it appears to be the underlying intent of the speech that is objectionable, along with the fact that those people could have an ideological following among the cretinous masses.  It's true: you would do well to fact-check information from certain sources.  But why must I be lumped in with those groups just because I share one belief in common?  Moreover, why must it be assumed that I draw the same conclusions?

And here, I flatter myself, my views trump those of my overly opinionated friends.  Politically, I do not champion (nor withdraw support from) a candidate because of their race.  I do not advance (nor discount) someone based on their lifestyle.  I do not bother myself with the rantings of popular figureheads with no real political power, labeling them as evil or their views as "frightening."  And I don't think my ideals are so vastly superior to my neighbor's that they should be foisted upon everyone for the good of society.


It would seem that no one wants to know how and why unless it bolsters one's previously held belief or counters someone else's.  I have been accused of dogmatically championing outdated ideas.  Well… yes.  Where moral codes and my beliefs about basic right and wrong are concerned, I trend towards unyielding.  Where concepts of politics, human interaction, and science are involved, my opinions are ever evolving.  There are often good reasons to change one's opinion.  New facts come to light, and scientific breakthroughs consistently turn old science on its ear: big shifts in thinking in nutrition, cosmology, physics, and biology.  If you haven't discovered a compelling reason to change your opinion about some long-held belief in the last 10 years, you aren't thinking; you're just remembering.