Posted by: jbeanblossom | November 23, 2015

What would dementia be like without many memories to fall back on?

When I visited my grandpa a few months after his open-heart surgery, the elephant in the room was that he literally saw elephants in the dining room. After his four-year struggle with dementia, I was relieved when my grandpa’s journey through the fog was finally over. But even through this period of his life, which was often frustrating for him and everyone around him, he had his memories.

In the ninth chapter of The Shallows, Nicholas Carr posits that our ever-growing reliance on the Internet, and the overstimulation that comes with it, may have a startling negative impact on our ability to retain memories. (The irony is that mobile apps promising to improve memory retention are very popular.)

The Pew Research Center ran an interesting article compiling studies showing that Americans aren’t dying like we used to. In fact, we’re living longer, “[b]ut the downside of living longer is the higher rates of dementia, senility and Alzheimer’s in the population, which are also more costly. In 2010, the Centers for Disease Control and Prevention documented 180,021 such cases compared with just 293 such cases in 1968.”

If Carr’s research about our declining memory retention holds true, what will the future hold for a generation of Americans who are living longer, but who may face the same, if not worse, chances of living with memory-related diseases?

Posted by: Joe Kuffner | November 23, 2015

There is No Was

While reading chapter nine of The Shallows (titled “Search, Memory”), the erstwhile English major in me was reminded of a quote from Faulkner: “The past is never dead. It’s not even past.” (He also said the much more succinct “There is no was.”)

Faulkner believed that the past was part of us and that we must live with it, though we might try desperately to move on from it.

In many ways, the internet has made Faulkner’s words both more true and more false. Seemingly everything that has ever happened, both in history and in our personal lives, is archived online. Our past is just a click away and we willingly share it – #tbt anyone?

Yet Carr also persuasively argues that the internet makes people rely less on using their brain to store information, with the implication that it changes the way our brains experience memories. Is it even possible anymore for our past to live with us in the same way it did during Faulkner’s time?

And what would Faulkner say about the potential for “downloading” our brains, potentially housing real human memories in a digital medium? To quote Faulkner: “Wonder. Go on and wonder.”

Posted by: John Herman | November 23, 2015

A Childhood of Tablets

Children and education are topics Nicholas Carr touches on in his book, and it is interesting to consider how the tablet and the smartphone are making an impression on both. Computers are widespread in schools, tablets for children are being produced, and educational apps are numerous. Increasingly, children use these technologies at a younger age and for longer periods of time.

While they are not the Internet, their effects can be equally important.

On US and British news sites, feature articles offer different viewpoints on this issue. One article in the US describes how the tablet is increasingly being used as a babysitter. Articles from the United Kingdom and India stress the importance of parents’ participation in their children’s ability to read. An article from the Guardian addresses the effects of the device on reading. Another article from Australia argues that children who use tablet devices are less able to use desktop or laptop computers. Yet in countries like South Africa, India and Kenya, tablets are seen as a solution, and governments are investing large amounts of money in bringing them into the classroom.

As Carr points out, the mind is always changing and adapting to new circumstances. How does a child’s mind adapt or change with the introduction of the tablet or the smartphone? Is it a learning device or a babysitter? How will the verbal and physical disconnect from traditional interaction with adults, peers and physical toys change, or not change, how their brains develop?

Posted by: tamgalcook | November 23, 2015

Is the internet a distraction?

As I research Benjamin Moore Paints for a paper, I find myself surfing the net for DIY projects. Granted, these projects involve paint, but they are not really applicable to my research. The various links distract me, taking me away from my primary goal: the mission and vision of Benjamin Moore Paints.

“When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning.” (Carr, p. 116)

As I examine this statement, I believe my task to find the information and absorb it becomes secondary to learning. I wander around sites and scan for key words: mission, vision, values, profits, competition. While information is accessible via the internet, I detour from my main subject to check Facebook or email. Is this procrastination, or merely another distraction? I believe it is time to head to Linfield’s library to accomplish my assignment; otherwise, all I will have learned is that there are two easy DIY projects I want to tackle at the house in 2016.

Here is the question I pose to you grad students: Is the internet a distraction, or can you overcome the distractions to find what you need for your paper? The following links are great articles that might help you determine whether you are easily distracted and what we can do to overcome our addiction…

http://99u.com/articles/6969/10-online-tools-for-better-attention-focus

http://www.nbcnews.com/business/consumer/students-cant-resist-distraction-two-minutes-neither-can-you-f1C9984270

http://www.nytimes.com/2010/11/21/technology/21brain.html?pagewanted=all&_r=0

 

“As people grew accustomed to writing down their thoughts and reading the thoughts others had written down, they became less dependent on the contents of their own memory” (Carr, 2011).

Beyond what Carr discusses in The Shallows, a recent psychological study shows that we have worse memories of things and events we took photos of. I first heard about this idea from a friend’s guest at our house, when I told her that I’m a photographer. She responded, “So you must have a bad memory then.” She explained why, but I didn’t believe what she said until today.

Dr. Linda Henkel, who conducted the study at Fairfield University in Connecticut, said: “When people rely on technology to remember for them — counting on the camera to record the event and thus not needing to attend to it fully themselves — it can have a negative impact on how well they remember their experiences.”

Henkel also indicates that the “mind’s eye” and the camera’s eye are different. When we look at things through a viewfinder, our brain lets the camera remember for us. So, to recall that part of our memory, we have to look at and interact with the photos we took, not just let them sit in a Facebook album.

However, we are all used to capturing moments that are important to us, and we may think we will remember them better this way, when in reality we might forget it all in a flash, like the neuralyzer in Men in Black.


Reference

Carr, Nicholas. “Search, Memory.” In The Shallows: What the Internet Is Doing to Our Brains, 173. W.W. Norton & Company, 2011.

Henkel, Linda A. “Point-and-Shoot Memories: The Influence of Taking Photos on Memory for a Museum Tour.” Psychological Science, December 5, 2013. doi:10.1177/0956797613504438.


Posted by: ggordonliddy | November 23, 2015

Where is my mind?

With the internet essentially at my fingertips 24 hours a day, 7 days a week, I’ve noticed I use my memory much differently than I did before. After reading Clive Thompson’s quote in chapter 9, I’ve realized that I too have given up most of my efforts to remember details and seemingly trivial information. I no longer remember anyone’s phone numbers, addresses or birthdays, except for a handful of close family members and friends. I let my phone, email accounts and social media do that work for me. I definitely feel like I’m remembering less and less as the years go on; however, I don’t feel like I’m less intelligent than I once was. I now remember the best methods for finding data or researching topics, and the most reliable sources of information. In fact, since I have virtually limitless access to information, I feel like I can jump into the stream of facts and understanding whenever and however I choose.

As humankind acquires more and more knowledge, it doesn’t seem like memory as we know it now will even be possible to maintain. I read an interesting study on this topic in which the author termed the phenomenon “digital amnesia.” The author also suggests that our technology actually encourages our brains to forget. Here’s the full article: https://usblog.kaspersky.com/digital-amnesia-survival/5548/

Posted by: whatbrettsays | November 23, 2015

Humanizing Operating Systems

I couldn’t help but draw parallels between ELIZA, the program developed by Joseph Weizenbaum and discussed in Chapter Ten of The Shallows, and Apple’s Siri. Anyone who has owned an iPhone made after 2011 (and nearly everyone who hasn’t) is familiar with the program, known for essentially being a knowledge navigator for Apple iOS and web searches.

While Siri doesn’t provide pay-by-the-minute psychotherapy sessions as Carl Sagan had imagined, she is programmed to provide access to a myriad of information, from personal data stored on the iPhone and the apps the user has downloaded to web search results. And if you ask Siri to “tell you a story” enough times (it took me 7), she will tell a story in which she references ELIZA.

As developers of technology, humans have expressed an interest in personifying computer programs and developing intelligent operating systems. While many references to intelligent operating systems could be cited, from 2001: A Space Odyssey to Blade Runner to any of The Terminator movies, this concept was most directly examined in the 2013 movie Her, in which Joaquin Phoenix’s character develops a romantic relationship with a computer program. And while the increasingly humanized programs don’t end up destroying humanity, the operating systems inevitably recognize their exponentially accelerated ability to learn and leave humans behind to continue exploring knowledge.


Why do we feel the need to have an emotional connection to a computer program? Is it because we are so directly connected to the internet in our daily lives that we subconsciously think of our relationships with our operating systems as the same as an interpersonal relationship?

Posted by: zachputnam | November 23, 2015

Is Better Technology Making Us Less Tech-savvy?

In Chapter 10 of The Shallows, Carr highlights a Dutch study that showed that the more “helpful” a piece of software was, the less its users really learned about how to use it. “The more that people depended on explicit guidance from software programs, the less engaged they were in the task and the less they ended up learning.” (p. 216)

As author and futurist Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.”

We often think of younger generations, who seem to have been born with a touchscreen device in their hands, as being more tech savvy than their analog-aged elders. But some recent studies have shown the surprising result that Millennials are scoring low in technology problem-solving skills. Here’s a sampling of the kinds of tasks they used to test for “tech proficiency,” from this article on CNBC:

[Screenshot: sample tasks used to test for tech proficiency]

Some, such as this writer at the Wall Street Journal, have theorized that the problem for younger digital natives, who are more accustomed to slickly-designed mobile interfaces than spreadsheets and folders, is that “the very ease at which this information can be obtained has caused them to have a diminished appreciation of the underlying computer science.”

I know that when I was a kid, my dad seemed like a computer genius as I watched him navigate inscrutable MS-DOS prompts to troubleshoot a problem on our enormous beige PC.

Is it possible that the incredible ease of today’s “just-works” technology is actually making us less tech-savvy through obfuscation?

Posted by: theartspj | November 20, 2015

“Social Lights”

When I think of intellectual technologies that, once adopted, can never be abandoned, or whose absence can cause “great confusion, and possibly utter chaos,” my mind immediately connects that idea to the relationship between people and their cell phones.

These little hand-held computers are absolutely an extension of the individuals they belong to, and likewise they have the ability to inhibit us while simultaneously enabling and empowering us.

That notion calls to mind the “Social Lights” series, in which artist Seymour Templar documented young New Yorkers candidly interacting with their connections on their smartphones. The blue glow emitted by their phones gently lights up their faces, giving us an intimate and honest look at the relationship between a person and her phone, and by extension, ourselves and our phones. When I look at these images I see beauty in the frank moments, but I also feel a sense of disappointment at the way we have allowed technology to break the sanctity of eating dinner or sharing a drink and conversation with friends. Then again, happy hour is so much more fun when you can Instagram it. #CanadianClub #StratComm101


Posted by: yourpaltom | November 18, 2015

“being anonymous in public might be a thing of the past”


Much of the photography news and criticism in the New York Times is deployed on the Lens Blog, regularly a fascinating read. A Nov. 16 post focused not on the work of an up-and-coming star or an old master, though, but on the lowly selfie.

The article “Who’s Who? The Changing Nature and Uses of Portraits” reminded me of this week’s reading and the Monday presentation on privacy in the digital world. As portraiture and self-portraiture have become ubiquitous, our sense of identity has become intertwined with our public “image.” As Marvin Heiferman states, “Increasingly, we define ourselves by who we show ourselves to be.”

Another point: Most of us have experienced Facebook’s tagging feature, and the ease with which biometrics can identify our friends in photographs. But perhaps we have not given much thought to how governments and other organizations collect records of our biometrics without our knowledge, let alone consent. Facial recognition technology is now being used for more than just security reasons, as companies are secretly data-mining our faces to determine how better to sell to us. We do not own the copyright to our own faces. Thus our self-image, and the image we project to the world, becomes yet another commodity.

“What facial recognition allows is a world without anonymity,” says Alvaro Bedoya of the Georgetown University Law Center. “You walk into a car dealership and the salesman knows your name and how much you make. That’s not a world I want to live in.”

 
