When Nostalgia is Stupid

The Save button vexes me, or at least the icon for the Save button.

I turned on my computer at work this morning and was once again confronted by the diskette save button that seems ubiquitous on most software. I’m also annoyed by the designation of the C: drive.

Why is the image for the save button a diskette? (No, it’s not a floppy disk; that’s something else. Go look it up, I’m too lazy to show you.)

There was a good reason the save button was a diskette. The diskette was, for a short period of time, the way we saved files. It could hold up to 1.44 MB of data, and every computer in the world had a B: drive which could read the diskette. Yes, the B: drive. The reason it was the B: drive was that the aforementioned floppy disk was the A: drive. The internal drive eventually came along and was called the C: drive.

You may have noticed that there are no drives for floppy disks or diskettes anymore; we now save files on the network, the web, or a removable USB drive. The diskette no longer has a place in the world. There is no A: or B: drive on your computer, but there is certainly a strange little icon we use for saving things. I’ve often heard it referred to as “the little TV set button.”

People in this world have a fondness for something called nostalgia. We remember past events through what are called rose-colored glasses. We see this every day, and it’s not necessarily a good thing.

Wikipedia tells us: The term nostalgia describes a sentimentality for the past, typically for a period or place with happy personal associations.

I’m of the opinion that an over-reliance on nostalgia can be a negative influence on life. When we base our behavior on misconceptions about the past, it is bound to make our lives worse.

As an example: there were good things about simpler times in the 19th century, but there were awful things as well. Women had few if any rights. Racism and anti-Semitism were the norm, not the exception. If we yearn for a simpler time without understanding that we also give away the ability to travel long distances easily, eat oranges outside warm regions, have cold drinks, and enjoy many of the other things modern society offers us, we do a disservice to ourselves.

My point in all this? I want to end the use of the diskette as the Save button icon. It makes no sense to modern users and thus can cause confusion. It’s bad in general. Death to the Diskette!

P.S. Can someone please make the C: drive the A: drive and end that madness?

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Ideology
Current Release: The Spear of the Hunt
Next Release: The Broken Throne

No Coders, No Code

There was an interesting article today in Small Business news about how the United States is not graduating many people who know how to write computer code. We don’t have teachers who can teach computer code.

The article points out that it is a growing field that requires far more people each year than are graduating from college. To a certain degree this is capitalism at work. If there are not enough people to fill a job, the salary for that job goes up and attracts more people.

Here’s the problem. Salaries aren’t going up because there are plenty of people to fill those jobs; they just aren’t from the United States. Countries like India, China, and Russia are graduating large numbers of people with coding skills. The rest of the world is churning out scientists while the U.S. has a smaller and smaller percentage of its college graduates filling these niches.

As I’ve said before, I’m thrilled that the so-called Third World is changing its society in a capitalistic fashion. It’s great that China graduated seven million students from college last year, and that India and the European Union are growing as well. When the world becomes filled with educated people who can do technical jobs with a high level of skill, it helps everyone. That’s a great thing.

Women are becoming empowered. The birth rate and population growth are slowing and may soon even become negative! These are good things for our world.

What’s bad is that the United States is in danger of falling behind. We still graduate many students with scientific degrees, with the ability to write computer code, and who excel in all fields. That being said, the trend is not looking promising.

The success of the free market and capitalism is infecting the world. Oppressive nations cannot hide the lifestyle of those who live in modern, western countries. People who see that it can be better, want it better. The internet has made the world aware that it’s possible to have a good job, a nice house, and plenty of food.

This change has inspired nations like China and India and that’s good.

If China, India, and other nations start to produce all the best scientific minds, the best computer programmers, the finest researchers, and the most strident capitalists; what will happen to the United States? Will we be the world’s leading economy? Will we be wealthy and prosperous?

We face challenging times in the United States.

While our politicians play games and offer false solutions we sit by in idle leisure, generally happy with our lot in life. We have a roof over our heads, food in the pantry, and entertainment to consume. We are content to blame the other party for all ills without bothering to look in the mirror.

I’ve talked long enough about problems. What can we do to stop this trend?

I offer no easy solutions, but here is a start: teach people critical thinking skills from kindergarten on up. Teach people how to think. Give them the tools they need to succeed in life. People with critical thinking skills realize that learning to write computer code will all but guarantee them a job and a decent salary in the modern world. They will not blame everyone else for what is wrong with their lives. They will not end up in a dead-end job and a miserable life. They will enrich their own lives and the lives of everyone around them.

The years are rolling past and time waits for no one. The modern world requires people literate with technology. The societies that produce the highest numbers of these people will become dominant. Those that do not will fall by the wayside. Not this year, not next year, but the wheels are in motion.

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Ideology
Current Release: The Sword of Water ($2.99 for a full length novel)
Next Release: The Spear of the Hunt (Out very soon!)

Air Force Software Development – $1 billion wasted

There was an interesting and still developing story in the news recently about money paid by the government for a piece of software that, to date, doesn’t work.

The reason it caught my attention was not really the dollar amount assigned to the waste but the fact that it was a software development project. That’s something my company does and I’ve been taking an increasing part in that process myself. I helped formulate my first bid recently and am beginning to get a more personal understanding of the concepts involved.

In this case the Air Force contracted for a complex piece of software that would do the job of many other pieces of software in a sort of unified system. There aren’t many details in the story and the contractor in question claims the software largely works. The Air Force spokesman says it does not.

What I want to talk about today is not necessarily the failure of this software but the entire idea of making the bidding process work.

My company is currently working on a piece of software that we seriously underbid. It’s an undertaking that has been going on for years. The thing that’s important to understand is that everyone loses. The client doesn’t have its software, and we continue to throw man-hours at the problem without any extra pay. The problem largely arises from poor bidding practices. If the contract had been bid appropriately, maybe the client would have said, no way, too expensive. They would have saved money and so would my company, for we have spent far more in man-hours than we received in payment.

I see the bidding process with government agencies as a mixed bag. Some agencies seem able to accept appropriate bids while others, particularly the Department of Defense, seem willing to accept artificially low bids only to see projects fail to complete on time and arrive hugely over budget.

This doesn’t work for anyone. The company that makes the low bid certainly ends up with the contract, but the amount of work it does is not commensurate with the pay, and the project can turn into a losing situation for everyone; see the F-35 debacle. The government does not get the equipment, or at best receives some substandard version of it.

In this case what bothers me most is that the company that made the bad bid originally is still being contracted for a number of other government software programs. At the very end of the video they mention another $8 billion in software bids that apparently returned little or nothing.

As with my own company, this kind of thing can happen. People can underestimate bids; things can prove more complex than originally imagined. However, a company that fails this miserably should not get any more money. I don’t think that is the case with some government contracts. Many are so rife with corruption that a fair and reasonable bid has no chance of winning the contract. I do think this is department dependent. Some departments manage their bids better than others.

The question becomes, how do we manage the bidding process to get the best product at a fair price? With billions and even trillions of dollars at stake the idea that we can remove corruption entirely from the process is naive. With that much money at stake unsavory sorts are going to be drawn in.

Capitalism means that the company making the bid should make money. The contract should then be fulfilled within a reasonable percentage of the original bid and a quality product delivered.

Sadly, I’m of the opinion that the money is so immense and the corruption so entrenched that there are no easy answers. An independent agency with the sole job of evaluating bids seems like a good idea, but that adds complexity and cost because you have to pay those people. Another possibility is some sort of metric-based system in which the quality of the final product and the proximity to the original bid are assigned numeric values. These values would accumulate over time to favor bidders with good track records. I’m generally in favor of such metric-based systems, although corruption in assigning values is still possible.
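To make the metric idea concrete, here is a minimal sketch in Python of how such a scoring system might work. Everything specific here is my own assumption, not anything from actual procurement practice: the equal weighting of quality and cost proximity, the rule that a project at double its bid scores zero on proximity, and the class and function names.

```python
from dataclasses import dataclass, field

# eq=False keeps identity-based hashing so contractors can be dict keys.
@dataclass(eq=False)
class Contractor:
    name: str
    history: list = field(default_factory=list)  # scores from past projects

    def record_project(self, bid: float, final_cost: float, quality: float) -> float:
        """Score one completed project.

        quality: 0.0 (unusable) to 1.0 (excellent), judged on delivery.
        Proximity rewards finishing near the original bid; a project that
        costs double its bid scores 0.0 on proximity (an assumed cutoff).
        """
        overrun = max(final_cost - bid, 0.0) / bid
        proximity = max(1.0 - overrun, 0.0)
        score = 0.5 * quality + 0.5 * proximity  # equal weights: an assumption
        self.history.append(score)
        return score

    def track_record(self) -> float:
        """Average score over past projects; 0.5 (neutral) with no history."""
        if not self.history:
            return 0.5
        return sum(self.history) / len(self.history)

def rank_bids(bids: dict) -> list:
    """Order bidders by track record (best first); price breaks ties."""
    return sorted(bids, key=lambda c: (-c.track_record(), bids[c]))
```

Fed each completed project, a scheme like this would let a chronically over-budget contractor sink in future rankings even when it submits the lowest number, which is the whole point of favoring track records over raw bids.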

It’s a huge problem, not so much for the wasted billions as for the idea that a business which regularly fails to properly fulfill its bids can continue to prosper. The very heart of capitalism, of Randian Objectivism, is rewarding success.

I’ve spoken about this many times. If we reward failure, the system rots from the inside. This is not capitalism; however, it certainly is what we seem to have today.

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Twist
Current Release: The Sword of Water
Next Release: The Spear of the Hunt

Artificial Intelligence

Today Intelligence Week takes a turn towards science fiction as I examine the concept of Artificial Intelligence. I’ve spoken about IQ tests and trying to define intelligence; now it’s time to determine whether computer intelligence has already surpassed, or will someday soon surpass, that of humans.

By many units of measurement computers are already far, far smarter than humans. Their biggest advantage is speed in information retrieval and their biggest disadvantage is in perception using senses.

One of the things I find interesting is the difference in definitions for human intelligence and computer intelligence. The same standards do not apply. I wonder how a computer would do on an IQ exam.

Still, I understand the differences. When measuring intelligence in a human we are working with a subject that we define as intelligent. All humans have reasoning abilities of one sort or another (hold the jokes) whereas with computers we are trying to determine if one can be created that thinks, for lack of a better term, like a human.

The definition of computer intelligence has been fairly well standardized in a series of problems to be overcome. The list is too lengthy and complex to cover here but it boils down to making computers accomplish tasks that humans can do with ease. Things like planning, learning, using social skills, and creativity.

Progress has been made on many fronts, but I’ll give two quick examples. A question-answering computer system named Watson recently won a Jeopardy! competition over top-level human foes. This represents an important step forward in artificial intelligence. Another example: since Deep Blue’s victory in 1997, chess computers have been better players than the strongest humans.

The Jeopardy victory in particular is interesting because it shows that computers are now capable of acting as Help Desk attendants in much the same way as the Computer on Star Trek helps the crew members. Imagine a superfast machine with access to an immense database on the other side of the phone instead of today’s automated system or a person reading from a script. This is something to welcome, not fear, although I know I’ll have a hard time convincing people of this argument.

It seems inevitable to me that, whether or not a computer ever actually achieves “artificial intelligence,” the role computers play in our lives is going to increase dramatically. And that’s how I want to wrap up this post, with some thoughts about what computer intelligence means for us in the near future.

Intelligence in computers mated with advances in robotics, a topic I’d like to take on soon, is well on the way to changing our lives. Robotic helpers with access to huge amounts of information will soon, I think, greet us on the phone, over the counter, when we arrive home, at school, and at work. Computer algorithms already help us tremendously every day if you think about our use of search engines.

The concept of computer intelligence is summed up in an idea called the Technological Singularity. There are many promises and dangers in this concept, but I don’t want to spend too much time on it today. Suffice it to say that computers are getting smarter and will continue to take a more active role in our lives, for good and for ill.

I guess that’s my final conclusion. It doesn’t really matter if a computer achieves the title of “Artificial Intelligence” or not. We are going to continue to improve computers and they will continue to play ever more important roles in our lives. The definition of intelligent doesn’t really change the fact of the matter. If the Cylons or Berserkers are the result or if R. Daneel Olivaw is the result, well, that’s where we are headed.

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Twist

Internet Week – DARPA

It’s hard to believe that, not that long ago, there was no such thing as the Internet and the World Wide Web. I’m going to take this week to praise some of the men and women who are responsible for our ability to communicate and transfer information via things like this blog.

Let’s start with DARPA. According to Wikipedia, the Defense Advanced Research Projects Agency is an agency of the United States Department of Defense responsible for the development of new technology for use by the military. It was originally created as a way to avoid being surprised by foreign nations’ technology, as had happened with Sputnik. Telling aspects of DARPA are its size and management philosophy. It currently employs about 140 highly skilled people, has only two levels of management, and enjoys the freedom to hire and fire whom it desires without standard government rules. All positions are rotated regularly, and people are generally hired for four-to-six-year terms. They understand that failure is a necessary component of innovation and eventual success.

Now, onto how DARPA invented the internet.

A computer scientist named J. C. R. Licklider conceived the idea of sending information from computer to computer as a network and became a project director at what was then named ARPA. He assembled a team to see this vision through. One of his team members, Bob Taylor, then created a plan and opened it up for bidding to contractors. A company called BBN Technologies won the bid.

The first network message was sent on the campus of UCLA on Oct 29, 1969, years after Licklider first conceived the idea. It caused the system to crash! In November of that year UCLA connected permanently with another station at Stanford. By 1973 foreign countries, Norway initially, began to connect to the system. In 1975 it was declared operational and turned over to the Department of Defense.

A lot has happened between then and now and I’ll talk about that as the week progresses but for the moment I want to focus on the ideas behind DARPA and some of its successes and past projects.

DARPA is probably as close a thing as we have to Ayn Rand’s concept of Galt’s Gulch in Atlas Shrugged. It is a place where intelligent and motivated people are allowed to pursue their dreams. The ideas brought to reality by DARPA include the Internet; the Aspen Movie Map (think about every movie you watch on the internet); drones and other unmanned vehicles, which are increasing in private, public, and government use; and something called the Semantic Web, which helps us find information more easily and was pioneered by a fellow you’ve never heard of named Tim Berners-Lee. You’ll hear a lot about him later in the week. Well, the list goes on and on.

My point here is to think about what kind of world we would live in if everyone worked in a DARPA-like environment. The problem is that most people don’t have the ability of the chosen few in Galt’s Gulch and DARPA. I’ve discussed this before, but the way to make it happen is through proper education. It’s important to teach children to think critically about everything to which they are exposed. Critical thinking leads to everything else. We must reward people for achievement and understand that failure is a part of that process. This, by the way, is one of my biggest problems with Atlas Shrugged and Ayn Rand’s philosophy in general. Her characters are too archetypal and seem to me to be unrealistic. There are no John Galts in the world, but we do everyone a service when we give the John Galt wannabes an opportunity to fail and to succeed.

Dare to dream but make a plan of action, envision obstacles and solutions, hire competent people, reward achievers, and make the world something beyond imagination!

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Twist

Science Rocks

Science Week – Computers

I don’t think anyone takes computers for granted these days, so there isn’t a lot of sense in telling everyone how important they are in the world. Instead I want to talk about how they, more than any politician, altered the economic landscape of the United States, and mention a few of the most important names in the field. It’s important to understand why computer technology kept the U.S. as the world’s leading economy and why we are now, once again, in some danger of losing that power.

So my loyal followers, dig into your closets, find that oft-used Time Travel cap, and place it firmly upon your head as we go … back … back … back to 1971.

Computers have been around for quite some time, with even the ancients using calculating machines. I’m skipping past the fascinating stories of Hero of Alexandria, Wilhelm Schickard, Charles Xavier Thomas, Ryōichi Yazu, Joseph Marie Jacquard, Charles Babbage, Herman Hollerith, Arthur Pollen, and Konrad Zuse, among a host of others. If you’ve time and inclination, these are all interesting stories. However, I’m skipping ahead a bit.

In 1971 Intel developed the first commercial microprocessor, the 4004, for a Japanese calculator company. What I think is important here is that a U.S. company built it for a Japanese company. At this time Japan’s economy was growing while the U.S. was beginning a period of stagnation. Japanese cars were flooding the market, and American consumers rightly found them to be superior to home-built vehicles. Technology from Asia was beginning a flow that continues to this day, with China leading the way.

Then in 1975 a little machine called the Altair 8800 was introduced, and a group of young Americans began to play with it. A couple of young fellows named Paul Allen and Bill Gates wrote a BASIC interpreter for it. Two other young guys, Steve Jobs and Steve Wozniak, began to work on their own versions of home computers.

Now, I’m going to leave aside all the name dropping and get back to the economics of computers and how they changed the landscape of U.S. power. By the late 1970s there was a feeling that the U.S. was losing its place as the preeminent economy in the world. Gasoline embargoes and the rise of Asian technological advances contributed to a perception that probably had some merit, even if it was overblown.

Computers changed all that. With companies like Microsoft, Apple, a reinvigorated IBM, Hewlett-Packard, Xerox, Commodore, and a host of others suddenly pumping huge sums of money into the economy and paying massive tax bills, our economy grew at an astonishing rate. The link between economic growth and technical achievement is strong. However, the boost we gained from computers is waning, as it does with all new technology. There are some arguments that this boost was smaller than others throughout history.

With new technology our living standards improve dramatically, our work week declines, our free time increases, and our buying power increases. I think many of these things are directly attributable to the rise of computers and their related technologies.

The lesson I take from all this is that if we want to continue to improve our lives then we need to continue to invest in emerging technologies and particularly reward entrepreneurship. Too much of late I see Crony Capitalism and regulations designed to empower the established businesses at the expense of the small innovators.

This is a core message of Ayn Rand and Objectivism. If the big companies squeezed out Microsoft, Apple and others with regulations and government intervention our lives would have suffered. The individual achiever must be allowed to innovate and achieve and then we all benefit.

In my opinion, the next new technology is alternative energy. If we continue to invest heavily in subsidies for oil, we will fall behind other nations researching nuclear, wind, solar, wave, thermal, and other sources of power. If this happens we will lose our place as the most powerful economy in the world. I’ll take that topic on in more detail soon.

For now I simply want to say thank you to all the men and women who bring me computer technology! Gentlemen, Ladies, thank you! Maybe you can take the time to head down into the little cave where your IT staff resides eating donuts and making fun of the technologically illiterate. Ignore the odors, the dank depression, the wild eyed maniac drooling in the corner, and any other strange things you might see, pop your head in with a cheery smile and say, “Thanks!” Then get out of there while you still can!

Tell me what you think in the comments. Like, Tweet, Stumble, Pinterest, PlusOne, and otherwise share with your friends if you think this is worthwhile subject matter.

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Twist

Teaser – Science Week – Computers

Yes, amazingly, Science Week continues at the behest of my thousands of fans! Tomorrow I take on a subject near and dear to my heart: computers. My personal employment depends on computers, and they have changed the world. I’ll look at the early days of computer development and the effect they have had on the economy of the United States.

You might learn a few things you didn’t know about men like Bill Gates and Steve Jobs and you will almost certainly gain a new appreciation for a personal hero of mine, Sir Tim Berners-Lee. And let’s not forget everyone’s favorite … Al Gore!

Stay tuned for day four of Science Week!

Tom Liberman
Sword and Sorcery fantasy with a Libertarian Twist