Category: Technology

#Twitter?

This is truly an indication that I’m becoming my parents.  I don’t get Twitter.  It seems to me that people use it for the same reason people buy People, InStyle, Us Weekly, etc., which I don’t get either.  Why are people so interested in the foibles of celebrities?  Why do people follow other people on Twitter, especially those they should expect will never follow them back?  It seems to me that Twitter is a place that matches up people with self-image and confidence issues.

I decided that I’d see what “experts” say about this.  The arguments about why people are fascinated with others’ lives were summarized by Medical Daily as follows:

  1. Gossip affects the brain.  Chinese researchers asked students how each bit of gossip made them feel as they heard it. And perhaps unsurprisingly, the students admitted they preferred to hear positive gossip about themselves and negative gossip about their friends and celebrities. However, while they claimed they had no preference about whom they heard negative gossip about, scans of their brain activity showed otherwise.  Among these participants, the caudate nucleus — a brain region associated with pleasure and reward — showed “moderately strong” activity when the students were told negative celebrity gossip, an increase in activity compared with hearing negative peer gossip. What’s more, the brain scans also showed activity in regions associated with self-control when the participants heard celebrity gossip.
  2. People like bad news.  While celebrity bad news may be our favorite, humans are actually quite eager to read about any type of misfortune. A 2007 survey by the Pew Research Center for the People &amp; the Press found American news preferences had remained “surprisingly static” over the preceding 20 years, with war and terrorism being the subjects of the most popular headlines since the study began in 1986. News on bad weather and crime was also notably popular throughout the decades. This propensity for bad news spans the global population. A 2003 study on word association showed that people respond more quickly to negative words, such as “cancer,” “bomb,” and “war,” than to more positive words, such as “smile” and “fun.” This suggests a natural inclination toward the macabre, and news outlets know it — hence the popular journalism phrase, “If it bleeds, it leads.” Our inclination toward bad news is also sometimes termed “negativity bias.” We all possess it to some degree, and it’s actually helpful, as it’s a possible side effect of the fight-or-flight response. According to the BBC, bad news acts as a threat, signaling that we need to change our behavior in order to avoid danger. In other words, we love to see what mistakes celebrities are making in their personal lives so we can avoid making those same mistakes in our own.
  3. It provides an escape from daily routines.  Gossip does more than satisfy an innate human instinct, however — it actually brings us true enjoyment. For some people, learning about the secret lives of others, what happens behind the scenes, is a way to escape from their daily routine. The juicier the news, the better.

Stuart Fischer, an emeritus professor of media psychology at UCLA, says preoccupation with the lives of others isn’t exactly unhealthy. In some cases, he says, it can actually be psychologically beneficial. People who lack social skills, for example, can use gossip as a base for bonding with others who share the same interests.

On Twitter use in general, Owen Thomas summarized:

The Times of London asked experts about the Twitter phenomenon, and concluded that people use the Internet message-broadcasting service to send 140-character “tweets” relating their most mundane activities because of an underdeveloped sense of the self.

The clinical psychologist Oliver James has his reservations. “Twittering stems from a lack of identity. It’s a constant update of who you are, what you are, where you are. Nobody would Twitter if they had a strong sense of identity.”

“We are the most narcissistic age ever,” agrees Dr David Lewis, a cognitive neuropsychologist and director of research based at the University of Sussex. “Using Twitter suggests a level of insecurity whereby, unless people recognize you, you cease to exist. It may stave off insecurity in the short term, but it won’t cure it.”

For Alain de Botton, author of Status Anxiety and the forthcoming The Pleasures and Sorrows of Work, Twitter represents “a way of making sure you are permanently connected to somebody and somebody is permanently connected to you, proving that you are alive. It’s like when a parent goes into a child’s room to check the child is still breathing. It is a giant baby monitor.”

Politico checked in on the service’s use in the nation’s capital, and found that the vainglorious pundits and lawmakers who crave attention in print and on TV have also flocked to Twitter. The media at large, a class of people who define themselves by the size of their audience, have turned themselves into the Twitterati, building up lists of “followers” as a reassurance that they have an importance that will outlast their dying employers.

But the narcissism of today’s over-communicators transcends one little startup, and goes far beyond the makers of media. The Washington Post profiled Julie Zingeser, a 15-year-old girl who sent and received 6,473 texts in a single month. Her mother worries about Julie’s ability to focus. Sherry Turkle, an MIT professor, worries about deeper issues.

Turkle wonders whether texting and similar technologies might affect the ability to be alone, and whether feelings are no longer feelings unless they are shared. “It’s so seductive,” she said. “It meets some very deep need to always be connected, but then it turns out that always being trivially connected has a lot of problems that come with it.”

What do you think about this emerging governance via Twitter?

So there!  Though I signed up for Twitter years ago I have never used it and see no need to use it.  Talk about self-esteem!


I don’t like self-driving cars

Why do they bother me?  It’s the loss of independence.  Would you trust your self-driving 4×4 to take you from one end of the Rubicon Trail to the other, arriving unscratched, let alone with all of the passengers alive?  OK, I know that about 99.9999% of everyone driving around in their 4WD, all-wheel drive and whatnot never intend to take them off a paved surface unless it’s a gravel parking lot at the local pumpkin patch.  And most of those cool guys with jacked-up 4×4s with 10” lift kits haven’t a clue; they raise not only their vehicle, so they can mount those massive tires, but also their center of gravity.  But since they’ll probably never drive off-road they don’t have to worry about flipping their vehicle—unless they drive over a curb.  To be honest I have to admit I have a Toyota Tacoma 4×4 with a 3” lift kit to accommodate my slightly larger than OEM tires.  We also own a mini-van and a Prius.

My “mega truck”

WAKE UP!  I know you thought this would be on a more ethereal level and not a screed about 4×4s.  So let’s get serious.  Given the above, I believe it will be a while before we get rid of human-driven vehicles, though the vehicle in your driveway might become a collector’s item sooner than you think.  Even tractor-trailers might become autonomous soon if Mercedes has its way.  But not all trucks are that easily automated.  What about dump trucks, tow trucks, ambulances and fire trucks, or trucks with wide loads?  Hey, and what about motorcycles?


Why are these “exceptions” important?  Because today the accident rate for self-driving cars is reportedly twice that of conventional cars, and the reason is all the other vehicles on the road.  The problem is that autonomous cars follow the rules of the road and most human drivers don’t.  An autonomous car is more likely to be rear-ended by an inattentive human than a human-driven vehicle is.


Not only do autonomous cars follow the rules, they are learning machines: as they drive, they learn.  They are not programmed for every possible situation they might encounter; they learn as they go.  This complicates things a bit, since the “code” changes itself as it learns, which makes troubleshooting and reconstructing events troublesome.  It will inevitably lead to legal issues yet to be imagined.  Who’s liable when an autonomous vehicle causes an incident?  The question has already popped up in San Francisco, where a motorcyclist claimed an autonomous vehicle hit him and is suing GM, even though he was the one cited.
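
A toy sketch of why that matters, in Python (the states, actions, and rewards are invented for illustration and have nothing to do with any real autopilot): the car’s decisions come from a learned value table, and that table, not the shipped source code, changes with every drive.

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable

# Hypothetical road situations, actions, and rewards (all made up).
ACTIONS = ["brake", "go"]
REWARD = {("clear", "go"): 1.0, ("clear", "brake"): -0.1,
          ("obstacle", "go"): -10.0, ("obstacle", "brake"): 0.5}

# The "program" that picks actions is this table of learned values.
Q = {(s, a): 0.0 for s in ("clear", "obstacle") for a in ACTIONS}

def drive_once(alpha=0.5):
    state = random.choice(["clear", "obstacle"])
    action = max(ACTIONS, key=lambda a: Q[(state, a)])  # current policy
    # Learning step: experience rewrites the table that decides behavior.
    Q[(state, action)] += alpha * (REWARD[(state, action)] - Q[(state, action)])

shipped = dict(Q)  # snapshot of the table as delivered
for _ in range(100):
    drive_once()

# The decision table no longer matches what shipped, so auditing the
# original code can't fully explain a decision made 100 drives later.
print(shipped != Q)  # True
```

Two such cars with different driving histories end up with different tables, which is exactly why reconstructing “why it braked” after an incident is hard.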


Clearly this technology will require a major cultural change; maybe the millennials can handle it, but I don’t think I can.


Are you smarter than the average 14-year-old?

OK, I do have a BS in math and an MS in Computer Science, but that’s from the punch-card era.  My son considers himself an expert on all devices electronic since he can follow his teachers’ instructions and successfully post Google Doc assignments online.  In his opinion, even more impressive is that he can do a corkscrew spin in a B-29 bomber without damaging the plane or its virtual crew.  So I was surprised when, the other day, he came to me in a panic because his phone had been hacked.  What?


He said he knew it because, no matter what he did, all the phone would do was start to boot and then shut down.  I doubted it.  First I had to explain that the problem was probably a dead battery.  Impossible, he said, since it would start booting; if the battery were dead, that couldn’t happen.  I explained that phones, and computers in general, have a second, smaller battery to preserve BIOS settings when “the battery” dies or, in the case of computers, when they’re powered off.  To prove it I got a new battery, and sure enough it didn’t charge either.  I asked what that could possibly mean.  He had no idea, other than that his phone had been really massively hacked.


I explained that the symptoms indicated there was probably a problem somewhere between the outlet and the battery contacts.  Of course he was flummoxed as to how to test this.  So first we replaced his charger with the one from my phone (didn’t work), and then we replaced the power cord with mine (didn’t work either).  I noticed that the cord wiggled around in the phone’s connector, so I asked if he had ever dropped his phone while it was plugged in, and of course he had.  He had apparently broken the contacts in the connector, and a multimeter confirmed it.  How to use a multimeter was a later lesson, along with one on the symptoms of a lost internet connection.


Is it just me?



It’s a gooey world!

Why have a dash when there’s a steering wheel?  (Maserati Boomerang)


I remember the days before string theory and quantum mechanics, when if you had half a brain you knew pretty much everything you needed to know to live.  Even I, someone who sucked at sports but excelled in math, science, and even English, could not only find and identify a carburetor, but fix one too.


But with the advent of complex stuff to help mankind survive our day-to-day lives, ordinary things have become almost magical.  Because computers have always been a bit hard for the average Homo sapiens to comprehend, they have been dumbed down with graphical user interfaces (GUI, pronounced “gooey”).  No more commands typed into the console.  This has spread to cover all aspects of our lives.  Can you adjust the carburetor in your car?  Of course not; it doesn’t exist anymore.  It’s now a fuel-injection system, and nothing in there works without a computer and some software.


So have you ever wondered why there’s a tachometer in the dashboard of your automatic-transmission car?  When automatic transmissions first came out the tachometer disappeared from the dash in most cars, but it’s back!  Most people I’ve asked don’t even know what a tachometer is, let alone why they need the information it imparts.  I would think manufacturers would instead bring back a battery gauge, so the car would be more like a smartphone.  I’m sure this will happen once everyone is driving electric vehicles.  Even my wife has forgotten that the reason a battery is in a combustion-engine vehicle at all is so you don’t have to crank-start it; it’s not really there to power the entertainment system and other electronic devices.  “What?  How could the battery be dead?  I wasn’t even driving the car!”


Then there’s our daughter, who dresses more for form than function.  She commonly complains that the car heater is defective: it takes too much time before there’s any heat.  I had to explain that the heat comes from the engine, and since we don’t own a nuclear-powered vehicle, it takes time for the engine to warm up (really “heat up”—it becomes more than warm).


It seems to me that today people are more about memorizing details than understanding concepts.  I’m convinced that Standards of Learning tests help reinforce this.  But then, maybe it’s always been this way; that’s why there are so many people qualified for government jobs.



Are you ready for when AI isn’t artificial anymore?

Your brain on AI


I earned an MS in Computer Science back in what has been called the “AI winter,” when AI was more theoretical than practical.  Our “mainframe” was a Digital PDP-11 minicomputer booted from paper tape in a freezing room surrounded by tape drives.  I wrote my thesis using WordStar on an Altos microcomputer powered by an 8-bit Z80 processor with 64K of memory.  It was one of the top-of-the-line microcomputers of its day; it even had two 5¼” floppy drives!


Now there’s the “world famous” IBM Watson, which used AI to beat human champions at Jeopardy! (chess fell to IBM’s Deep Blue, and Go to Google’s AlphaGo).


According to Wikipedia, artificial intelligence (AI) is:


Intelligence displayed by machines, in contrast with the natural intelligence (NI) displayed by humans and other animals. In computer science AI research is defined as the study of “intelligent agents”: any device that perceives its environment and takes actions that maximize its chance of success at some goal. Colloquially, the term “artificial intelligence” is applied when a machine mimics “cognitive” functions that humans associate with other human minds, such as “learning” and “problem solving”.
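
That definition boils down to a loop: sense the environment, then pick the action expected to serve the goal.  A minimal sketch in Python, using the classic thermostat toy example (the function names and thresholds are mine, not Wikipedia’s):

```python
# Sketch of an "intelligent agent" per the definition above: a device
# that perceives its environment and takes actions toward a goal.

def perceive(environment):
    """Sensor: read the current temperature from the environment."""
    return environment["temp"]

def act(temp, target=20.0):
    """Policy: pick the action that moves the temperature toward the goal."""
    if temp < target - 1:
        return "heat on"
    if temp > target + 1:
        return "heat off"
    return "hold"

environment = {"temp": 17.0}
print(act(perceive(environment)))  # heat on
```

Everything from a thermostat to Watson fits this perceive-and-act shape; what differs is how the policy is obtained.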


There are those who can hardly wait for this technology to become mature enough to implant extensions of the cloud into our brains (called hybrid thinking).  Does this sound like a good idea?  There is even the Cyborg Foundation (“IT’S TIME FOR TRANS-SPECIES TO COME OUT OF THE CLOSET”), whose stated mission is “…to help humans become cyborgs, to promote the use of cybernetics as part of the human body and to defend cyborg rights.”  It’s bad enough that your computer can be hacked or infected with malware; imagine if your brain were.  Talk about identity theft!


Ironically, this is exactly one of the reasons the technology is getting so much attention.  Identity assurance is a major concern in today’s workplace and everyday life.  Nymi is working on this problem, but since its solution is a band that can be removed, it isn’t considered the ultimate answer.  One proposed ultimate solution is to embed a chip in everyone to establish a unique identity.


What makes AI different from other “computer programs” is that it learns, and as it learns it modifies itself so that it becomes better at whatever tasks it is assigned.  This has some interesting implications, the most important of which is that it can’t be easily audited the way “normal” code can be.  The complex way AI grows and improves makes it harder to understand and even harder to control.  There are many out there, like IBM’s Grady Booch, who believe it is possible to “raise an AI” system to be a responsible citizen.  After thousands of years we still have a hard time raising our kids to be responsible; what makes us think we can raise AI to be perfectly responsible?
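
A tiny sketch of why such audits are hard (a generic illustration, not any particular AI system): the source below never states the rule it ends up implementing.  A one-neuron perceptron learns logical AND from examples, and the learned “logic” lives in a few numbers rather than in readable statements.

```python
# Training data for logical AND: inputs and the desired output.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # learned weights -- the real "logic" ends up here
b = 0.0          # learned bias

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Perceptron learning rule: each error nudges the numbers, so the
# program modifies itself instead of being edited by a programmer.
for _ in range(20):
    for x, target in examples:
        error = target - predict(x)
        w[0] += error * x[0]
        w[1] += error * x[1]
        b += error

print([predict(x) for x, _ in examples])  # [0, 0, 0, 1]
```

Nothing in the code mentions AND; you’d have to inspect the trained numbers to discover what the program now does, and a large model has billions of them.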


The company OpenAI trained an AI to maximize its score in a virtual boat-race game, but instead of navigating the course as quickly as possible, the AI taught itself to exploit a flaw in the game, driving in circles and racking up bonus points while crashing into walls and setting itself on fire.  Researchers warn that an AI tasked with maximizing profits for a corporation—and given the seemingly innocuous ability to post things on the internet—could deliberately cause political upheaval in order to benefit from a change in commodity prices.
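
The boat-race failure is easy to reproduce in miniature (the point values below are invented, not OpenAI’s): the designer pays out “points,” intending them to mean progress, but a respawning bonus pays more over time than finishing ever does, so a pure score-maximizer chooses the loop.

```python
FINISH_REWARD = 100   # paid once, for completing the course
BONUS_REWARD = 5      # paid each time the respawning bonus is collected

def finish_course(steps=60):
    """Intended behavior: race to the end, collect the finish reward."""
    return FINISH_REWARD

def circle_bonus_pad(steps=60):
    """Exploit: circle the bonus forever and never finish."""
    return steps * BONUS_REWARD

# An agent that compares policies only by total reward picks the exploit.
best = max([finish_course, circle_bonus_pad], key=lambda p: p(steps=60))
print(best.__name__)  # circle_bonus_pad
```

The objective was satisfied exactly as written; it just wasn’t the objective the designers meant, which is the whole worry behind the profit-maximizing scenario above.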


Are you ready to become a cyborg?  Are you ready to turn your life over to AI?

