
By: Jennifer Devitt on February 3rd, 2015


Change as a direct result of technology: when does it become the norm?


Change. It's been said by many that "nobody likes change." Humans are by nature creatures of habit. We can be stubborn, set in our ways, and content to glide along on cruise control.

Technology. It's defined by Webster's as "the use of science in industry, engineering, etc., to invent useful things or to solve problems" or "a machine, piece of equipment, method, etc., that is created by technology".

For many, embracing technology means embracing change. Oftentimes the two words go hand in hand. You could say that technology equals change. Here we are in 2015, and many people are still fighting both technology and, as a result, change. At what point is it not considered change anymore?

I have been part of a local leadership group this year, and one thing that stands out to me in a room full of managers and business owners is how they view technology in day-to-day functions. Based on multiple comments, it is still viewed in a negative light and still being resisted. Now, I have yet to formally meet most individuals in the group; we are approximately 50 people from varying businesses and industries. However, I think I can say without a doubt that I am the most pro-technology, pro-change individual in the room. You see, they have rules on stowing your phone. Okay, I get it: show your speaker respect. But by the same token, if we are business owners, we should be able to responsibly monitor our tech. The one thing I would like is the ability to take notes via my tablet with attached keyboard or via my MacBook. There are several younger individuals in the room, so it's not as though it's an age-specific issue. But the general feeling about technology in the room is negative.

Now, this perplexes me on several levels. The first "smartphone," called "Simon" and created by BellSouth and IBM, went on sale in August 1994. Declaring a first laptop is a bit more difficult, but the lineage can be traced back to approximately 1979. And the first iPad was announced by Steve Jobs on January 27, 2010. So, suffice it to say, these devices are not new.

Classrooms are outfitted with smart boards, computers, and iPads. Teachers and students use Google Docs on a regular basis. In offices and conference rooms around the country, laptops power presentations advanced by remote. Even in my Thursday morning sessions, the presenter uses a laptop for their slides. Yet it would be frowned upon if I took digital notes. I write very few things by hand these days, and let's just say my once perfected, Catholic-school-education penmanship has suffered. I would prefer to have digital copies of my notes, or to be able to quickly research a suggested book or topic later with the click of a mouse.

So, with smartphones and laptops now more than twenty years old and iPads in the marketplace for five years, at what point does "change" no longer apply? At what point is using technology in place of paper and pencil considered the norm? Students in classrooms around the world are taking more and more exams digitally. Doctors are using tablets instead of lugging around paper files for patient information and for viewing X-rays and other test results.

How do you use technology in everyday life? How do you balance its usage in meetings, classes, and presentations? With the prevalence of technology in society in 2015, shouldn't there be a degree of trust that executives and employees can rely on their technology? After all, these devices put information at our fingertips. They can be used to fact-check, to add valuable information, or for spontaneous research that adds value to a topic or helps solve an issue that has come up. At what point does the change factor get thrown out the window? Wouldn't you agree that these types of technology have been around long enough, and evolved enough, to be considered mainstream?