User:Sabub/NMAC 4460 Journal
- 1 August 24, 2019: New Media
- 2 August 24, 2019: New Media Revisited
- 3 September 8, 2019: Foundational Thinkers
- 4 September 8, 2019: McLuhan, the Medium, and the Message
- 5 September 15, 2019: Being Digital
- 6 September 15, 2019: Understanding Hackers
- 7 September 21, 2019: Open-Source vs Proprietary Software
- 8 September 22, 2019: Open-Source, Free Software, and New Media
- 9 September 29, 2019: Read-only vs. Read/write culture
- 10 September 29, 2019: Participatory and Remix culture
- 11 October 4, 2019: Learning through Dream Machines
- 12 October 5, 2019: Education in the Cloud
- 13 October 13, 2019: Cyberspace and the "Real" World
- 14 October 13, 2019: Mixed Reality and New Media
- 15 October 20, 2019: Narratology and Multiform Stories
- 16 October 20, 2019: Narratology vs Ludology
- 17 November 3, 2019: I, Cyborg
- 18 December 7, 2019: Reflection
- 19 References
August 24, 2019: New Media
New media, as I understand it, provides communication, information, and entertainment through digital channels. This would include the internet, computers, smartphones, and so on. Specific examples of new media that come to mind are social media like Instagram or Twitter, blogs, and even Wikipedia. New media differs from "old media" like television and newspapers since it is interactive and constantly changing. New media allows people not only to receive messages but also to put out their own messages for others to see. I think new media is also constantly moving since information can be put out and edited at any time. There is some overlap between old media and new media, examples being online newspapers and television streaming online. New media is such a broad term that I find it difficult to identify everything that falls under it.
August 24, 2019: New Media Revisited
After looking more into new media, it seems my original definition falls into that of Lev Manovich's proposition of "New Media as Computer Technology used as a Distribution Platform". While it is correct to some degree, there are quite a few ways to define new media. In particular, I realized I did not really touch on the studies associated with new media. The study of new media can focus on the objects themselves (like smartphones). However, new media is not only about how digital instruments are used but also about how they change the way people act and think. I'm kind of surprised I did not think about the theory and studies associated with new media, considering I have taken so many classes related to the subject. I think I got so caught up with past definitions I have read, which focus mainly on digital devices, that I forgot about the cultural relation. New media is "a cultural process that involves not only the actual transmission of information but also the ritualized collocation of senders and recipients". As I mentioned briefly in my previous post, new media allows people to interact more. I did not mention just how much new media allows people to come together, share their ideas, and even start movements. Through new media, people have brought back their favorite shows, driven political campaigns, and spread movements like #MeToo. New media is just as much about the social aspects as it is about the digital ones. I think it is important that I pay more attention to that since I am a new media student.
- @Sabub: I didn't think of the devices themselves as a form of new media either at first. I liked your second point about how new media and communication go hand-in-hand. That's literally the name of our degree, but I never really thought about it in a sense of going together. We really have taken so many classes on it. I liked what Shannon found in her research. She mentioned that new media is a form of self-expression. If we are looking at new media as a cultural process involving communication, there's an aspect of how we represent and identify ourselves in that process. New media is definitely a powerful tool, but there are ways that the tool shapes the user, too. In the same way that agriculture has made farming less of a necessity for everyone (meaning fewer people would be able to grow their own food, if you look at it in a negative way), there are ways that new media is changing us that we are beginning to see. I think one negative way would be false information spreading so quickly. Hthrxlynn (talk) 22:06, 27 August 2019 (EDT)
- @Sabub: Reading this opened my eyes to the realization that I have done the exact same thing! It is crazy how comfortable we get with what we are used to that we so easily overlook all the other aspects that can go along with something. New media is definitely a large reason, if not the entire reason, as to why virtual/digital communication exists and has spread worldwide, and why it is so important to society today. It drives the way we start and run our businesses, meet new people, create entertainment and commodities, express who we are, and enjoy entertainment ourselves. With our world so heavily dependent on technology today, new media and communication (the name of our degree, haha) are basically synonymous. As I thought about this more, the numerous ways in which new media has affected my experiences, people I know, and who I have become as a person are endless. Shannamartini (talk) 22:58, 7 September 2019 (EDT)
- @Sabub: After reading your standpoints on new media and how it is distributed, I also think that sometimes we forget how to communicate with others when using technology. For example, from working in a restaurant, I sometimes notice how families and couples who come to eat together will choose to use technology instead of communicating. Technology, as a distraction, can drive us away from communicating with individuals. New media in the world keeps us so updated with newer things that in a way it shapes who we are. Jameiladudley (talk) 18:45, 8 September 2019 (EDT)
September 8, 2019: Foundational Thinkers
While surely not the first to attempt to conceive a new-age information device, Charles Babbage became a computer pioneer after he proposed the analytical engine. Though this mechanical general-purpose computer design never became a reality, it surely inspired future versions of the computer. The same could be said for Vannevar Bush and the memex. Though only hypothetical, the idea of an endless library of information became the inspiration for later engineers and inventors, notably the "father of the mouse", Douglas Engelbart. Bush and his ideas for the memex, in a way, predicted the online encyclopedia and personal knowledge base software, among other things. While these thinkers presented ideas for new systems, later foundational thinkers proposed ideas to improve on systems that already existed. J. C. R. Licklider saw the need for easier interaction between the computer and its user. Licklider wanted symbiosis: for man and computer to work together, almost like man and man would. This close relationship between human and computer was also seen by an earlier thinker, Norbert Wiener. Wiener saw a connection between feedback mechanisms in machines and intelligent behavior. It seems that most of these foundational thinkers took examples from the human brain and applied them to computers, and vice versa when thinking about how the two could interact. These foundational thinkers, along with ones I did not mention, have influenced new media through their ideas. Though they did not create the tools we use today, they proposed systems and foresaw possibilities, like interactivity and "human-machine symbiosis", that others could build off of. Had they not pioneered the way, new media might not be what it is today.
September 8, 2019: McLuhan, the Medium, and the Message
Marshall McLuhan's phrase "the medium is the message" says that the medium, or the channel itself, is more important than the message within it. The medium has more of an effect on us than the content itself. I found this idea incredibly interesting. I have heard this phrase many times before, only now really understanding it. The medium we receive information from can affect how we live. A medium can affect us like how architecture helped in the evolution of music, or how television made us change how we look at time and how our homes are set up. Also, the way we think changes based on the medium. I think social media is an example of how applicable the idea of the medium being the message is. The way apps like Twitter are set up affects how we communicate and create messages. Twitter's character limit for each tweet (originally 140, now 280) means we have to think differently and change how we use language. We shorten our words and use slang to create a message because of the medium we are using. This, I believe, emerged from texting, since the point was to convey a message quickly. Now, this "text speak" is used in everyday life, whether on social media or face-to-face. At this point, we rely pretty heavily on media as a whole to shape how we interact and live. As McLuhan said, technology is an extension of ourselves that we don't really realize.
- @Sabub: Savannah, I love what you said about how the medium changes the way we think. I read the article I'm going to be writing my Wikipedia article about, A Rape in Cyberspace, this week. It talked about an online game where one user essentially used his character to violate other people's avatars on the game and what that meant in this online world. It was interesting to actually find myself agreeing with the victims of the situation, mentally comparing it to some early yet intense form of cyberbullying. You should read it if you get the chance! I'd love to hear what you think! Hthrxlynn (talk) 20:17, 8 September 2019 (EDT)
- @Sabub: You make valid points about McLuhan's "the medium is the message" and how it relates to the current medium most people use to communicate. Social media has changed a lot since its beginning, when 32 characters were all an instant message could send on a flip phone. Now, with apps like Twitter that, as you said, limit the number of characters per post, it has changed how people interact on social media as well as in person. Trying to remember all the acronyms for text talk compared to acronyms already in use can be overwhelming. This links back to how the medium really is the message, and how we'll communicate a decade or more from now. --Amayesing77 (talk) 23:43, 8 September 2019 (EDT)
September 15, 2019: Being Digital
In 1995, Nicholas Negroponte outlined the history of digital technologies in his book, Being Digital. Along with the general history, he also predicted possibilities for the future of these technologies. Many of these predictions ended up coming true in some way or another. In particular, Negroponte's concept of newspapers being tailored to our specific interests is something that has been becoming more common over time. The site DailyMe allows just this, as users set up their own front pages with articles that fit their interests. Social media could also be applied to this idea: people subscribe only to who they want, so they only get the "news" they are interested in.
Negroponte's comments on touchscreen interfaces, in his book and even earlier in his TED talk, also seem to have been a prediction. While interfaces like the mouse are still used, the touch screen has spread to phones, computers, and more. I think what is most impressive about Negroponte is how he saw digital technologies becoming a teaching tool, whether through games or accessing information. Negroponte identified this possibility both in his TED talk and his book, and like many of his other predictions, it became a reality through the One Laptop per Child project.
September 15, 2019: Understanding Hackers
My understanding of hackers after reading The Hacker's Manifesto is that hackers do not intend to do harm. Their main reason for hacking is to use technology to explore and expand on what they already know. It is curiosity, not ill intent, that motivates them. The negative aspects that are usually attributed to hackers are actually brought upon by crackers. I personally used to put the two together in my head, even though "hackers build things and the crackers break the things".
Though I am still a bit hazy on what "being digital" really means, I think hackers are a good example of the idea. I think being digital means using digital technology in both new and beneficial ways. It is finding ways to improve on what we already have. Hackers are a great example because they take note of what can be improved and how to protect what they are working on from crackers. Catherine Bracy has a very good quote on hacking in her TED talk: "It's the idea that if you see a problem, you work to fix it, and not just complain about it". I think the hacker's mentality is useful to everyone when it comes to being digital, so we can keep learning and improving in technology (and possibly the world).
- @Sabub: I think what we view as hacking today is seen as digital, but when Bracy was talking about Benjamin Franklin, she was pointing out that people don't have to have advanced technology to be hackers. And I also would put hackers and crackers together in my head. I guess that is because of the information we are exposed to about hackers. Whenever someone reports on some credit card company that has had a security violation where information on their clients is stolen, the headline usually includes the word hacker. I wonder if the people in media reporting on these types of things know there is a difference between hackers and crackers. --Jkoplin1 (talk) 18:57, 15 September 2019 (EDT)
- @Sabub: I thought it was interesting to learn that a hacker is not someone wanting to do any harm. When we watch television and movies, the hacker is someone who seems to be the one who is out to take advantage of the system to harm people and do things like break into your bank account. Learning that hackers can actually do some good has given me a new perspective on them. Now that I have learned this new information, I think that hackers are necessary. They can help point out the flaws in a system before crackers can take advantage of them. MGray1196 (talk) 21:08, 15 September 2019 (EDT)
September 21, 2019: Open-Source vs Proprietary Software
While watching Revolution OS, I realized I have never thought too deeply about proprietary software versus open-source. I use proprietary software every day since I use Microsoft Windows and Adobe Photoshop, due to industry standards, even though I could probably find a similar open-source program. However, I use open-source software as well, like MediaWiki and Blender. I also use the Android operating system, which I just found out is based on Linus Torvalds' Linux kernel, though it isn't completely open source.
New media is littered with both proprietary and open-source options. While I use both in my daily life, I can't help but find myself seeing more benefits from open-source software. As Richard Stallman and the others say in the documentary, users should have the freedom to use the software how they want. When users have freedom, the software itself benefits through community contribution and evolution. Although I understand wanting to keep software proprietary because you worked hard to create it yourself, it can be both greedy and detrimental to the future of technology and new media when companies keep it secret to make money. How are we supposed to improve upon something if it is kept a secret? I think that the idea of the open-source movement applies to any type of media. If we show how something is created, we can build upon it and learn from it to make new things.
- @Sabub: Good job with your short refs, but your bib needs to be in alphabetical order. Please see my feedback that I will be updating through the day on 9/24/19. —Grlucas (talk) 14:37, 24 September 2019 (EDT)
September 22, 2019: Open-Source, Free Software, and New Media
Like many people, I previously thought that open-source software and free software were the same thing. While they stem from the same roots and share many of the same guidelines when it comes to sharing source code, they have different views. Where free software refers more to the user's freedom, open-source is focused on practicality. I think both views are important when thinking about new media.
Eric S. Raymond's proposition "given enough eyeballs, all bugs are shallow" made it easier for me to understand how beneficial showing source code to the public is. When more people have access to the source code, they can make corrections and open up a marketplace of ideas, which makes it more practical for the software developer. The term open-source itself is also good for making the idea more digestible in the corporate world, a case in point being the term's adoption around Netscape's release of their source code. I think that when major companies adopt this practice of releasing their source code, it benefits not only them but also the future of open-sourcing.
I find that the idea of "free software", or I suppose specifically Richard Stallman's views, also rings very true. Freedom is important because it fosters participation and community, not just when creating software, but also in various forms of media. Restricting information doesn't do much good for innovation. It takes longer for things to be improved when you only let a few people see them. Based on my understanding of what new media is, we need collaboration, and to be critiqued, in order to create or improve it. So following the way of the open-source movement seems like a no-brainer.
September 29, 2019: Read-only vs. Read/write culture
Lawrence Lessig argues that we do not need to sacrifice read/write (RW) culture in order to protect read-only (RO) culture, or vice versa. The two can and should coexist. In fact, RO culture is needed to create remixes, since that is what remixes are made from and is something we can learn from. However, RW culture is also valuable to us, as it serves as a creative opportunity to take what we learn and make our own products. Both cultures are important, but it seems that there is a large focus on protecting RO while disregarding and diminishing RW through copyright laws. Lessig proposes that the way to also protect RW culture is to decriminalize it. Regardless of whether these laws are in place or not, RW culture will still exist. But if it continues to be considered a crime, younger generations will be taught that they are criminals and that remixing, along with the creativity that comes with it, is a bad thing. Current laws should not be applied to remix culture because there is nothing harmful about it.
I found myself agreeing wholeheartedly with Lessig's argument. As someone who has grown up in remix culture, I have never seen it as harmful. It is great to enjoy and learn from work created by professionals, but it is also a great experience to create or view remixed work. It gives me an opportunity to see a piece of work from another perspective I never thought of and to witness grassroots creativity, as Henry Jenkins puts it, at its finest. When we treat practices that foster creativity like they are wrong, things can get pretty boring.
- @Sabub: I agree with you about how people treat RW culture. Some people will look at it in a negative way, but the fact is it does encourage people to be creative. Now, I will say too much remix can get annoying. If we look at how Disney is redoing almost every film that made them a bit of money, then yes, repetitive remixing can become dull and lose some of the original quality. But using RW culture as a means of inspiration isn't a bad thing. --Jkoplin1 (talk) 14:38, 29 September 2019 (EDT)
- @Jkoplin1: I was actually thinking about all the Disney remakes while reading about remix culture. I kind of distance it from "typical" remix culture because, while they are using the same story, they are still making everything else from scratch. Also, I see these movies as more of a money grab, rather than a creative reimagining as they call it. So I'm not sure how much I would consider these as a part of the remix culture. Sabub (talk)
September 29, 2019: Participatory and Remix culture
When it comes to the hacker ideology, open-source software, or participatory and remix culture, they all share a common relation to new media. They all deal with utilizing the information that came before and collaborating in order to further that information. They have similar goals, as well as similar obstacles. Participatory and remix culture does not aim to steal and ruin an author's work, just as hackers and software sharing do not. However, it seems that many see it that way. Specifically, Andrew Keen argues in his book The Cult of the Amateur that sites like Wikipedia and YouTube, which thrive on contributions, are full of ignorance and absurdity. He believes that we should focus on mainstream media, since they have better sources, instead of sites where contribution is prominent.
First, I wonder what he thinks about being on Wikipedia himself? Second, I wonder how we are expected to learn and grow if we do not take part in the creation process? I don't see how just being a part of a passive audience helps anyone. Especially since mainstream media isn't perfect. In order to develop media literacy, we need to practice it. I know I learned how to edit videos and photos through remixing.
Also, continuing a bit further on the argument against participatory and remix culture, the true author/artist is not always being looked out for. I think you should always support an artist since they do put work into their creations. However, that support doesn't always reach them directly. As Courtney Love points out, artists do not make all of the money from song sales, the labels take most of it. So when people use the argument that you are hurting the artist, whether it be monetarily or artistically, I don't think it is the strongest argument. If someone rips the song from a fan video you make, they probably weren't going to buy it anyway. That video also probably isn't going to taint the original meaning of the song. The artist is probably more concerned with the fact they don't own the song they wrote.
October 4, 2019: Learning through Dream Machines
After viewing the assigned texts, I found they supported my idea that a large part of new media is really about learning and collaborating through digital technology with the goal of improvement. Ted Nelson's Computer Lib/Dream Machines tells us that "being digital" is really just using a digital device as a tool to benefit learning and thinking, among other things. Sugata Mitra also expresses this idea in his TED talk. By having access to a device, anyone has the ability to "be digital" by seeking out information for themselves. I think both employ the hacker spirit since they encourage exploration for information.
While going through these texts, I found myself relating to all the points made about seeking information. I have never really had a problem learning how to use technology, and I usually learn how to use it on my own. I am the person in my family that everyone comes to when trying to figure out what is wrong with their devices. I am certainly not all-knowing, though. So I do reach out for help when I need it, whether it is through searching through a browser or asking someone I know. I often discuss our courses, including this one, with Heather (Hthrxlynn) to see what can be understood and done better. As Mitra showed through his "hole in the wall" experiment, anyone can learn by just looking for information. I know that is how I have acquired much of the knowledge I have. I surely haven't retained the information that was weakly crammed into my head when I was in high school. However, when I actually explore for myself and collaborate with others, I gain so much more from the experience.
- @Sabub: I appreciate all of your help every semester, too! I agree with you about the information retention. I have definitely learned a lot more in my discussion-based classes where we have to collaborate with other students and do our own research than I ever did in my lecture-based high school classes! Hthrxlynn (talk) 03:10, 5 October 2019 (EDT)
- @Hthrxlynn: I feel like I am still struggling with the "newer model" of learning since not everyone has adjusted to it as we have. It is hard to have a discussion when no one wants to say anything, so I get why professors get annoyed.
- @Sabub: Your posts have been super helpful to all of us who struggle here, so massive thanks for that! I also agree with you about learning by yourself and digital media being a tool. Before, what initially came to mind with digital media was just social media and non-education; after this lesson, though, I find digital media to be almost indefinable by only naming its parts. Kyannayeager (talk) 23:56, 6 October 2019 (EDT)
- @Kyannayeager: I'm happy to help anyone if I can! And you are right. New media is certainly hard to define since there are so many aspects to it!
October 5, 2019: Education in the Cloud
I have noticed that a big discussion around digital media and its relation to education is whether it is overused or could be used more. Some advocate for almost no screen time at home or school since they find children become too reliant on screens. Meanwhile, in places like China, AI education is being tested, and physical teachers are seemingly being pushed into passive roles. I can't help but wonder why we have to think of digital media or new media in extremes. Why can't we use it for what I think it is made for: to aid us and make things easier? We should have a healthy balance of new media as well as "real world" interaction.
Of course, this has been attempted with education reform models. Looking from an educator's perspective, Dr. Lucas mentions in his Medium post that he has approached teaching in this way. However, it doesn't always work well because students are trained to adhere strictly to factory-based education. I can attest to this from a student's perspective. I had my first taste of a discussion-focused classroom when I was in high school. I hated it. I didn't have the experience again until I started taking my major-focused classes in college. Again, I hated it because I thought, "Why can't they just tell me what I am supposed to do?" That was until I realized that I was actually learning, not memorizing, and thinking more creatively than I had before. While it was a great, eye-opening experience, I wish I could have had it earlier. So, needless to say, I think the education system is due for an overhaul.
This is where new media comes into play. As I said before, my understanding of new media is that it is a resource for information and interactivity, a resource that can and does greatly benefit education. Something I noticed in many of the suggested sources provided is that personalized education is highly desired by students. Sir Ken Robinson says that transforming education is "about customizing to your circumstances and personalizing education to the people you're actually teaching." I think digital media can play a huge part in that since we have the freedom to use it however we want to find the information we think fits our understanding best. New media, in conjunction with teachers being facilitators rather than just lecturers, can have a great effect on education. Whether we are willing to embrace it or not will determine where education leads.
- @Sabub: Very well said. I have been in the same rinse-and-repeat education, and even though I like open discussion, it always seemed like my answer to the open discussion was still wrong. Even though it was an open discussion, it didn't match the exact language of the question the professor had asked, who wanted a book answer under the guise of an open discussion. I watched Sir Ken Robinson's TED talk, and what he said made absolute sense, but I'm not sure there is a clear, agreed-upon way to get there. --Amayesing77 (talk) 00:34, 7 October 2019 (EDT)
- @Sabub: I'm honored to have been cited. Great work! I’ve always struggled with alternative ways of teaching for this very reason: students have been trained to be students in the way Mitra explains. This is difficult, if not impossible, to break. —Grlucas (talk) 11:28, 7 October 2019 (EDT)
- @Sabub: Totally agree with you on the education aspect; it's literally the same thing over and over again. I love that there are new ways being presented to us. I'm super excited about what the future holds because my kids will be a part of that future. New media is forever evolving, and I'm happy to be a part of the evolving process. VincentH81 (talk) 13:46, 7 October 2019 (EDT)
October 13, 2019: Cyberspace and the "Real" World
I think it is difficult to decide what exactly is real and what is not when it comes to cyberspace and virtual interactions. In games like Second Life, the physical may not be considered real, as it is all on a computer, but what about the emotional aspects? Though some may not consider these interactions real because they aren't done in person, I personally don't think digital interactions make them any less real. We are still connecting to other people, just through a digital medium as opposed to a flesh-and-blood one. At this point, I think that rather than the physical world being real and the digital one being fake, the two have overlapped with one another. In virtual worlds like Second Life, you can carry out real-life tasks like owning a business or selling property for financial gain in the physical world. Meaningful relationships can be made, even if they harm other relationships, just like in "real life". Some have even brought their strictly digital relationships into the physical world, as seen in the documentary Second Skin. And unfortunately, crime and assault, like cyberrape, are still possible, and while it may not take a physical toll, it can still have a great emotional effect.
After viewing the assigned texts, I saw again just how much "being digital" is a part of our lives. Being digital allows us to communicate and interact with people we relate to, as well as express who we are in ways we may not be able to physically. We form emotional connections not only to the people we meet but also to the characters we create. Though I have not built relationships in cyberspace as strong as some of the people in the texts have, even I have immersed myself in virtual worlds. I've done this both as a way to experience new people and as a way to escape from one life to another.
October 13, 2019: Mixed Reality and New Media
Unfortunately, Second Life decided to give me a hard time, so I couldn't create an avatar. However, that gave me a chance to look deeper into different aspects of cyberspace.
As I said in my previous post, I see cyberspace and the "real world" as overlapping, or at least parallel. I don't necessarily think that one is more real than the other. And this isn't really a new concept. I didn't really know much about MUDs before, save for my limited knowledge of text-based gaming. However, MUDs were definitely the root of a lot of the virtual worlds we have today. People were able to create their ideal selves through them while interacting with others. The same can be done on platforms like Second Life. People can temporarily live without daily restraints, then go back to everyday life. And now, more recently, virtual reality can really make you feel like you are in a different world. The development of mixed reality also furthers my belief that these different spaces have become merged, since it overlays artificial reality onto the physical.
The suggested readings did not mention social media, but I consider it to also be a way for cyberspace and our "real" lives to cross over. Augmented reality has become a part of social media (think filters on Snapchat and Instagram). There is also a similar portrayal of identity as there is on platforms like Second Life. Though people forgo the avatars and present some of their real lives, they may also present what they idealize their lives to be like, even if it isn't real. Identity is the common denominator here, regardless of the platform.
I think cyberspace and VR encompass a lot of what new media is. They allow for communication, collaboration, and creativity since they involve the creation and development of different worlds. While developing those worlds, we also get to present ourselves in ways we might never have thought of before. Just as we live in our physical world, we do the same within new media. We just do it over digital technology.
- @Sabub: I like how you pointed out the fact that the virtual worlds are technically real as well. They can be just as important to an individual as their own real life, for within the virtual realm we have even more control over ourselves in terms of appearance, location, etc.; not to mention they can be whatever we want them to be, whereas in real life, sometimes we have to deal with the cards we are dealt, so to speak. Shannamartini (talk) 23:06, 13 October 2019 (EDT)
- @Sabub: I agree with social media crossing over into our real lives that it is a similar portrayal of identity. We get so wrapped up into social media and sometimes we don't know how to disconnect from it. In a way, being able to have that digital technology in our hands it gives us a brief moment of freedom from reality. Armond.trice (talk) 23:33, 13 October 2019 (EDT)
October 20, 2019: Narratology and Multiform Stories
Janet Murray uses the term 'multiform story' in Hamlet on the Holodeck to describe a narrative with a single plotline that can play out in multiple versions, each version being mutually exclusive of the others. This narrative form is a newer way of departing from the typical linear format in media to explore multiple possibilities in a story. As Murray points out, a multiform story can reflect different perspectives of an event and how we look at our own reality. Multiform stories are important to narratology because they give us a new way of perceiving and interacting with a story. The multiform story has appeared in films like It's a Wonderful Life and the recent interactive film Black Mirror: Bandersnatch.
However, I think this story form is particularly prominent and well used in video games. I would argue that narrative is important in video games, as the actions taken in them are often used to advance the story, whether in a role-playing or an adventure game. The stories within these games often fit the multiform narrative because players must make decisions, and with each decision a new version of the story is created. The next time a player plays the game, they can create a new version by making different choices. A good example of a game with a multiform story is Until Dawn, a horror game based on cause and effect where each choice determines who lives, who dies, and what ending you will get.
October 20, 2019: Narratology vs Ludology
In my previous journal post, I touched a bit on narratology and how narrative is present in video games. I now want to discuss both narratology and ludology, since there is a lot of debate concerning which is more important when studying video games. First, I would like to preface this with my own experience. I am not much of a video game player, since making decisions in short amounts of time makes me anxious. However, I am an avid video game watcher. I like to watch other people play games and follow the stories attached to them. For example, I like watching the YouTuber Markiplier play through games like Fran Bow, a dark fantasy adventure game with a really interesting story.
Based on my own experiences with video games, I definitely lean towards the narratology argument. I usually think of video games as a storytelling medium. However, even though my mind defaults to this, I don't think that this is all video games are. Henry Jenkins accurately points out that not all games require a narrative element, and in turn video games aren't just a storytelling medium. I enjoy the game Dots, a mobile puzzle game where you match colored circles. No story is attached; there is just a focus on gameplay. In the article Genre Trouble, Espen Aarseth says simulation, or the gaming mechanics, is the key to video games. I will not dispute that simulation is important, considering mechanics are necessary for games to exist and for users to interact with them. And, in fact, that is what attracts many players: not the plot, but the play.
While I do lean more into the narrative associated with games, I do think ludology is necessary as well. Whereas books and films are mostly interpreted, games are approached through configurative practices. The way we interact with games is different from how we view other media. So it would be a disservice to only focus on one aspect of games when they contain many elements. Simply put, I think both narratology and ludology should be applied when studying video games.
The debate surrounding ludology and narratology represents many of the challenges that appear in new media studies. As innovation continues and new forms of media develop, different perspectives and ideas appear. There is no one right way of looking at new media. We must look at all the information and make our own decisions about how we think about it.
- @Sabub: I am also more on the side of narratology with video games. Even though not all video games are required or show narratology, it is easy to say that most do. It's so interesting how the outlook on video games is for me now after learning about the terms narratology and ludology, and like you said it should be applied to the study of video games. There is no way to not consider that video games have become their own little story to tell in the technology world today. Jameiladudley (talk) 21:54, 20 October 2019 (EDT)Jameiladudley
- @Sabub: I also think that some video games are more of a storytelling medium. Especially games who are driven off of story such as Kingdom Hearts. Those role playing games take you on a journey through the perspective of the main character and in order to progress, you have to complete pieces of the story that are missing. There are games that do not necessarily have story telling as a medium but like you mentioned, my mind often defaults to that thought. MGray1196 (talk) 23:36, 20 October 2019 (EDT)
November 3, 2019: I, Cyborg
When I think of cyborgs, my mind initially goes to something metal, cold...inhuman. However, looking at the definition and the resources provided, I realized that isn't quite right. Really, we are cyborgs. Today, the concept of transhumanism is a bit more realistic considering technology has become an extension of ourselves, both outside and inside our bodies. Though cyborgs are typically thought of as such when technology is intertwined with organic material, I think the idea can still be applied to how we interact with technology. Smartphones and watches can track our health, and our devices can practically read our minds as if they are connected. And now we are entering even deeper into life as cyborgs with continued technological advances. Whether it be a pill that supposedly extends life or an implant that can help people with paralysis feel again, technology has been used extensively in the medical field. There is even a microchip that could possibly prevent physical afflictions like strokes and mental illness, and boost overall intelligence. The integration of technology isn't limited to the big issues faced in life, either. A Wisconsin company gave its employees microchips so they could easily scan their hands to get into their offices, log in to computers, and get food from the company cafeteria.
Now, we are literally 'being digital'. We have developed technology to the point that we are one with it. I think this relates to new media in a big way. New media is the media that has been innovated by technology. The same has been done to humans. Where our organic bodies may only be able to reach a certain level, technology can take them to the next. I guess in a way we could be compared to traditional media, at least in my mind.
- @Sabub: Something that really stuck out to me that you said was that we were 'becoming one' with technology. We are becoming almost literally digital. Kyannayeager (talk) 22:45, 3 November 2019 (EST)
- @Sabub: I also think the opposite of the fact we as humans are the cyborgs when I hear the term. It almost makes me wonder if being a cyborg is what science fiction wanted for humans by "being digital".Jameiladudley (talk) 00:22, 4 November 2019 (EST)Jameila.dudley
December 7, 2019: Reflection
I'll be honest: I wasn't looking forward to this class. I hadn't heard the greatest things about the experiences of people who had previously gone through the course. However, I was pleasantly surprised by my experience. While I struggled and stressed a bit through the course, I also learned quite a bit about new media that I didn't know before. And while this wasn't the first time I have worked on Wikipedia, I was able to improve on collaboration as well as learning information on my own.
At first, I was a bit troubled by our weekly journal posts. I was used to specific directions on what to post, a prompt question that we would need to answer. Instead, prompts were much broader, allowing us to be freer in our writing. Since I am so used to this type of education, as I state in my Education in the Cloud post, I felt a bit annoyed. I wasn't sure what exactly was wanted from me. However, as I continued to write the posts and become more interested in the material, I became more comfortable. In fact, I really enjoyed sharing my perspectives on each week's subject. I think the only thing that would have made the journal posts more interesting would be more collaboration and communication between peers. It would have been cool to have deeper discussions about the material, instead of everyone agreeing with each other. I am guilty of doing this myself, but the times I did ask questions, I never really got any answers. Of course, this is just another learning experience for me to put myself out there more and start conversations myself.
As a soon-to-be graduate, I thought I knew quite a bit about new media and communications. However, once I started this course, I realized there was a lot I didn't know. That is kind of sad considering it is my major. I was happy to learn more about the theory and the big names, as well as reviewing what I had learned previously. Learning (and re-learning) everything reminded me why I am a new media major. This is the stuff I find interesting! New media and 'being digital' is a part of almost every facet of our lives. The way we create and display media can say a lot. And collaboration and sharing of ideas have helped it evolve so much. That is what I want to be a part of, so I'm glad I could learn about texts and ideas I never heard of before.
As I stated before, I have worked on Wikipedia in a previous class (the International Cherry Blossom Festival to be specific). That gave me a great base for working on my final article. I already knew how to format everything, and the basic coding involved, so that was a great help to me. I also picked a topic that I was interested in, the book Computer Lib/Dream Machines, so I enjoyed researching it. However, I still struggled a bit through the editing process.
When contributing to my article, I first made sure to find sources to back up the information that was already there. I then decided to add information to sections that were already a part of the article. This included the synopsis, format, and neologisms sections. I added information that I thought would be beneficial to anyone trying to learn about the book. I also broke the synopsis into two sections, to better describe the sections of the double-sided book. In the neologisms section, I added a few more terms I found were the most well known, as well as definitions for them.
As for the format section, I gave a bit more of an explanation of how the book looked. Since it is such a unique text, I thought it would be important to include. I wanted to include pictures of both covers and page layouts, but I didn't want to risk posting copyrighted content. So, I thought an explanation was the next best thing. When adding new sections, I added a background and a legacy section so there would be more context for when the book was written, as well as its influence. I found sources to support these sections, so they would remain neutral.
Overall, I believe I added about 11 sources. I think the information I added raised the quality of the article a bit. Of course, since the book is full of so much information and so well known, there is still a lot that needs to be added. I wish I could have added more, but a problem I had was that a lot of the texts I read said the same things about the book. They would mainly refer to the book's unique format, its being the first book about personal computers and freedom, and its relation to Project Xanadu. When looking at how exactly it influenced us today, a lot of blogs and websites I was not familiar with came up, so I didn't feel comfortable including them. Perhaps I was not searching in all the right places. I'd like to continue adding to the article even after the course is over. Hopefully, I can add even more substantial information. I'd also like to see more people editing it, since only so much information can be added by one person.
Overall, I think this course was a success, at least for me! I know not everyone had the same thoughts when it came to this course, but I thought it was a great opportunity. It was great preparation for the "real world" since there weren't explicit guidelines, and the public can see my contributions. I actually wish this was a lower-level course, as it would be beneficial to learn how to teach ourselves earlier in our college careers. Adding similarly structured courses would also be great; I know I would have appreciated knowing this stuff earlier. That being said, there were some struggles, but that is to be expected in anything you do. I was able to learn from my mistakes, which only made me improve in the long run.
- @Sabub: Well done, and i share your sentiment about getting this kind of class earlier. While I do teach this in NMAC 3108, I think sections of ENGL 1101 and 1102 would be the place to start. Very good work. —Grlucas (talk) 08:26, 8 December 2019 (EST)
- Manovich 2001, p. 10.
- Lucas 2013.
- Gitelman & Pingree 2003.
- Brown 2018.
- Bush 1945.
- Licklider 1960.
- Wiener 1954, p. 69.
- McLuhan 2003, p. 203.
- McLuhan 2002, p. 71.
- Negroponte 1995, p. 25.
- Negroponte 1984.
- Wakefield 2014.
- The Mentor 1986.
- EDUCBA 2016.
- Bracy 2013.
- Relph-Knight 2012.
- Noyes 2010.
- Villiers 2018.
- Raymond 1998.
- Kornblum 1998.
- Stallman 2009.
- Lessig 2008, p. 90.
- Lessig 2006, p. 131.
- Jenkins 2008, p. 90.
- Keen 2007, p. 4.
- Jenkins 2008, p. 170.
- Lessig 2006, p. 80.
- Love 2000.
- Nelson 1974, p. 306.
- Mitra 2013.
- Dangwal et al. 2005, p. 56.
- Bowles 2018.
- Hao 2019.
- Lucas 2015.
- Falk 2017.
- Robinson 2010.
- Motschnig-Pitrik & Holzinger 2002, p. 164.
- Films Media Group 2007.
- Alter 2007.
- Piñeiro Escoriaza 2008.
- Dibbell 1993.
- Turkle 1994.
- Jamison 2017.
- Kelly 2016.
- Bullock 2018.
- Murray 1997, p. 30.
- Murray 1997, p. 36.
- McMahan 1999, p. 149.
- Ostenson 2013, p. 72.
- Jenkins 2004.
- Aarseth 2004.
- Moulthrop 2004.
- Frasca 1999.
- Wallace 2016.
- Nutt 2016.
- Dwoskin 2016.
- Metz 2018.
- Aarseth, Espen (2004). "Genre Trouble: Narrativism and the Art of Simulation". In Wardrip-Fruin; Harrigan (eds.). First Person. pp. 45–55.
- Alter, Alexandra. "Is This Man Cheating on His Wife?". Wall Street Journal. Technology. Retrieved 13 October 2019.
Alexandra Alter on the toll one man's virtual marriage is taking on his real one and what researchers are discovering about the surprising power of synthetic identity.
- Bowles, Nellie (October 26, 2018). "The Digital Gap Between Rich and Poor Kids Is Not What We Expected". The New York Times. Retrieved 5 October 2019.
America's public schools are still promoting devices with screens — even offering digital-only preschools. The rich are banning screens from class altogether.
- Bracy, Catherine (2013). "Why good hackers make good citizens". Retrieved 15 September 2019.
- Brown, Dalvin (2018). "19 million tweets later: A look at #MeToo a year after the hashtag went viral". USA TODAY. Retrieved 25 August 2019.
- Bullock, Lilach. "AR And Social Media: Is Augmented Reality The Future Of Social Media?". Forbes. Retrieved 13 October 2019.
- Bush, Vannevar (1945). "As We May Think". theatlantic.com. Retrieved 8 September 2019.
- Dangwal, Ritu; Jha, Swati; Chatterjee, Shiffon; Mitra, Sugata (June 2005). "A Model of How Children Acquire Computing Skills from Hole-in-the-Wall Computers in Public Places". Information Technologies and International Development. 2 (4): 41–60. doi:10.1162/154475205775249319.
- Dibbell, Julian (1998). "A Rape in Cyberspace". My Tiny Life: Crime and Passion in a Virtual World. New York: Owl. pp. 11–30. ISBN 0805036261.
How an evil clown, a Haitian trickster spirit, two wizards, and a cast of dozens turned a database into a society.Dibbell's classic article about LambdaMOO.
- Dwoskin, Elizabeth (August 15, 2016). "Putting a Computer in Your Brain Is No Longer Science Fiction". The Washington Post. The Switch. Retrieved 3 November 2019.
- Falck, Libby (March 16, 2017). "6 Tips on the Future of Learning from Actual Teenage Exponential Thinkers". Singularity U. Medium. Retrieved 5 October 2019.
- Films Media Group (2007). "You Only Live Twice: Virtual Reality Meets Real World in Second Life". fod.infobase.com.
- Frasca, Gonzalo. "Ludology meets narratology: Similitude and differences between (video) games and narrative". ludology.typepad.com. Retrieved 20 October 2019.
- Gitelman, Lisa; Pingree, Geoffrey (2003). "What's New About New Media?". web.mit.edu. Retrieved 25 August 2019.
- "Hackers vs Crackers: Easy to Understand Exclusive Difference". EDUCBA. 16 July 2016. Retrieved 15 September 2019.
- Hao, Karen (August 2, 2019). "China has started a grand experiment in AI education. It could reshape how the world learns". MIT Technology Review. Retrieved 5 October 2019.
In recent years, the country has rushed to pursue “intelligent education.” Now its billion-dollar ed-tech companies are planning to export their vision overseas.
- Jamison, Leslie (December 2017). "The Digital Ruins of a Forgotten Future". The Atlantic. Technology. Retrieved 15 August 2018.
Second Life was supposed to be the future of the internet, but then Facebook came along. Yet many people still spend hours each day inhabiting this virtual realm. Their stories—and the world they’ve built—illuminate the promise and limitations of online life.
- Jenkins, Henry (2006). Convergence Culture: Where Old and New Media Collide (PDF). New York: NYU Press. ISBN 0814743072.
- Jenkins, Henry (2004). "Game Design as Narrative Architecture". In Wardrip-Fruin; Harrigan (eds.). First Person. pp. 118–130.
- Keen, Andrew (2007). The cult of the amateur : how today's internet is killing our culture (1st ed.). Doubleday/Currency. ISBN 0385520808.
- Kelly, Kevin (April 2016). "The Untold Story of Magic Leap, the World's Most Secretive Startup". Wired. Retrieved 15 August 2018.
The technology forces you to be present — in a way flatscreens do not — so that you gain authentic experiences, as authentic as in real life. People remember VR experiences not as a memory of something they saw but as something that happened to them.
- Kornblum, Janet (1998). "Netscape sets source code free". CNET. Retrieved 22 September 2019.
- Lessig, Lawrence (2008). Remix: Making Art and Commerce Thrive in the Hybrid Economy. New York: Penguin. ISBN 1594201722.
- Licklider, J. C. R. (1960). "Man-Computer Symbiosis". groups.csail.mit.edu. Retrieved 8 September 2019.
- Love, Courtney (June 14, 2000). "Courtney Love Does the Math". Salon. Retrieved 18 August 2018.
The controversial singer takes on record label profits, Napster and 'sucka VCs.'
- Lucas, Gerald (March 28, 2015). "The Liberal Arts Are Dead". The Synapse. Medium. Retrieved 5 October 2019.
Should we educators just face the music and accept the fact higher ed is now just for job training?
- Lucas, Gerald (December 23, 2013). "New Media". grlucas.net. Retrieved 25 August 2019.
- Moulthrop, Stuart (2004). "From Work to Play". In Wardrip-Fruin; Harrigan (eds.). First Person. pp. 56–70.
- Manovich, Lev (2001). New Media from Borges to HTML (PDF). The MIT Press. pp. 13–25. Retrieved 25 August 2019.
- The Mentor (1986). "Hacker's Manifesto". phrack.org. Retrieved 15 September 2019.
- McLuhan, Marshall (2002). "The Gadget Lover: Narcissus as Narcosis". In Spiller, Neil (ed.). Cyber Reader: Critical Writing for the Digital Era. Phaidon Press. pp. 69–74. ISBN 0714840718.
- McLuhan, Marshall (2003). "The Medium is the Message". In Wardrip-Fruin, Noah; Montfort, Nick (eds.). The New Media Reader. Cambridge: The MIT Press. pp. 203–209. ISBN 0262232278.
- McMahan, Alison (1999). "The effect of multiform narrative on subjectivity" (PDF). Screen. 40 (2): 146–157. Retrieved 20 October 2019.
- Metz, Rachel (August 17, 2018). "This company embeds microchips in its employees, and they love it". MIT Technology Review. Retrieved 3 November 2019.
- Mitra, Sugata (February 2013). "Build a School in the Cloud". TED Talks. Retrieved 4 October 2019.
- Motschnig-Pitrik, Renate; Holzinger, Andreas (2002). "Student-Centered Teaching Meets New Media: Concept and Case Study" (PDF). Educational Technology & Society. 5 (4): 160–172. ISSN 1436-4522. Retrieved 5 October 2019.
- Murray, Janet H. (1997). Hamlet on the Holodeck. New York: Simon & Schuster. ISBN 0684827239. A seminal work theorizing the cyberbard who can bring to bear yet-unknown talents in crafting the digital expression that will define our time.
- Negroponte, Nicholas. "5 predictions, from 1984". Retrieved 15 September 2019.
- Negroponte, Nicholas (1995). Being digital (1st Vintage Books ed.). Vintage Books. ISBN 0679762906.
- Nelson, Ted (1974). "Computer Lib / Dream Machines". In Wardrip-Fruin; Montfort (eds.). NMR.
- Noyes, Katherine (2010). "10 Reasons Open Source Is Good for Business". PCWorld. Retrieved 21 September 2019.
- Nutt, Amy Ellis (October 13, 2016). "In a Medical First, Brain Implant Allows Man to Feel Again". The Washington Post. To Your Health. Retrieved 3 November 2019.
- Ostenson, Jonathan (July 2013). "Exploring the Boundaries of Narrative: Video Games in the English Classroom" (PDF). The English Journal. 102 (6): 71–78. Retrieved 20 October 2019.
- Piñeiro Escoriaza, Juan Carlos. (Director) (2008). "Second Skin" (video). Retrieved 13 October 2019.
- Raymond, Eric S. (1998). The cathedral & the bazaar : musings on Linux and open source by an accidental revolutionary (1st ed.). O'Reilly. ISBN 1-565-92724-9. Retrieved 22 September 2019.
- Relph-Knight, Terry (2012). "FOSS v proprietary software: image editing". ZDNet. Retrieved 21 September 2019.
- Robinson, Ken (February 2010). "Bring on the Learning Revolution". TED Talks. Retrieved 18 August 2014.
- Stallman, Richard (2009). "Viewpoint Why "open source" misses the point of free software" (PDF). Communications of the ACM. 52 (6): 31. doi:10.1145/1516046.1516058. Retrieved 22 September 2019.
- Turkle, Sherry (1994). "Constructions and Reconstructions of Self in VR". In Spiller (ed.). Cyber Reader. pp. 208–214.
- Villiers, Marilyn de (2018). "Open source vs free software: what's the difference?". ITWeb. Retrieved 22 September 2019.
- Wakefield, Jane (2014). "Ted 2014: Negroponte on plans to connect last billion". BBC. Retrieved 15 September 2019.
- Wallace, Benjamin (August 23, 2016). "An MIT Scientist Claims That This Pill Is the Fountain of Youth". The Cut. New York. Retrieved 3 November 2019.
Leonard Guarente is certain he’s succeeded where doctors (and quacks) before him have failed. His pill will either extend lives or tarnish his career.
- Wiener, Norbert (1954). "Men, Machines, and the World About". In Wardrip-Fruin; Montfort (eds.). NMR (PDF). pp. 65–72.