Discourses

I’m helping organize a panel of faculty at my school who have been using a range of different technologies to support student interaction in and out of class. With so many options out there, we want to focus on what has worked for us and what hasn’t, and to start some conversations about how to make the jump from looking at your course, with its content, outcomes, and pedagogy, to drawing on others’ on-the-ground experience with these tools, with our systems and our students, to get some idea of which options might be appropriate.

Independently of this, I’ve got a group of students in a capstone using Basecamp to organize their project, and it’s got me thinking (not a very new thought, I’ll admit) about whether I like the idea of using this type of professionally-oriented project site in other courses to have students manage their groups. I get a lot less control than in a CMS, but the flip side is that I’m running into fewer cases where students are trying to make the site work for them but don’t have enough power. SIGCSE just had a discussion on its mailing list about which repository systems people are using, across all levels of courses. And when I see articles about new tools for collaboration and discussion, like the new Discourse discussion platform, I’m immediately thinking about whether the improvements they talk about (less pagination, more dynamic reply workflows, flexible content embedding, and moderation/ranking tools) would work well for class discussion too. It reminds me of the window of time when I was in school and it was normal, if not expected, for a department to have a set of forums/groups associated with it, and not just on a course-by-course basis. Is that still out there and I’m simply at an outlier school without them, or has that type of conversation been killed off? Is it (*shudder*) on Facebook?

Sometimes you can blame the compiler. Sort of.

I don’t know if this weblog entry about bug hunting in large-scale game development is more appropriate for my spring games course or my spring project management course. The stories are great for both. Team members with poorly defined roles! Frantic timelines leading to bugs! The reality of entire days lost to a bug that won’t be found, let alone fixed! Bugs explained in simple code a novice student can understand! I particularly enjoyed the explanation of why the live server was compiled without debug capabilities, violating the ideal that the dev and live servers be identically configured: it ensured that the debug features used to test the games (by forcing benefits, monster spawns, etc.) weren’t leaked into the live system. If there were never sensible reasons for breaking what seem like obvious rules, they wouldn’t get broken.
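
The post makes this point at the level of the build, where something like an #ifdef keeps debug code out of the live binary entirely. As a minimal sketch of the same idea (mine, in Python, with every name invented rather than taken from the post): the testing shortcuts simply never exist in a live build, so there is nothing to leak.

```python
import os

# Hypothetical build flag: debug commands exist only in dev builds. In a
# compiled engine this gating would be an #ifdef, so the debug code would
# be absent from the live binary entirely.
DEV_BUILD = os.environ.get("GAME_ENV", "live") == "dev"

commands = {
    "say": lambda player, msg: print(f"{player}: {msg}"),
}

if DEV_BUILD:
    # Testing shortcuts (grant rewards, spawn monsters) are never even
    # registered on a live server, so players can't trigger them.
    commands["spawn"] = lambda player, monster: print(f"spawned {monster}")
    commands["grant"] = lambda player, item: print(f"granted {item} to {player}")

def run_command(player, name, *args):
    handler = commands.get(name)
    if handler is None:
        print(f"unknown command: {name}")  # live builds land here for /spawn
    else:
        handler(player, *args)

run_command("alice", "spawn", "dragon")  # only works when GAME_ENV=dev
```

The tradeoff the post describes follows directly: the live build is no longer identical to the dev build, so some bugs only reproduce on one of the two.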

And there is an interesting lesson about customer satisfaction in the story of embedding code in a game to identify hardware failures: not only detecting when bug reports are actually due to customer hardware failures, but proactively telling customers they are having hardware issues before problems crop up, so they can do something about it. The explanation of why this ultimately saves the developers time, by avoiding hard-to-resolve bugs that are really due to hardware faults, makes sense. But it also reflects an interesting decision about how much responsibility to take for the entire game-playing experience, whether portions of that are actually one’s responsibility or not.
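
The post describes the idea rather than any code, but one common shape for this kind of self-check (a hedged sketch; the approach and names here are my guesses, not theirs) is to keep a block of constant data in memory and periodically re-verify its checksum. On healthy hardware the check can never fail, so a mismatch points at RAM or overheating rather than a game bug.

```python
import hashlib

# Hypothetical hardware canary: 16 KB of fixed data that the game never
# writes to. If its checksum ever changes, memory was corrupted
# underneath us -- a hardware problem, not a logic bug in the game.
CANARY = bytes(range(256)) * 64
EXPECTED_DIGEST = hashlib.sha256(CANARY).hexdigest()

def memory_check_passes() -> bool:
    return hashlib.sha256(CANARY).hexdigest() == EXPECTED_DIGEST

# Imagined to be called every few minutes from the game loop: a failure
# gets attached to bug reports (to triage them as hardware-related) and
# surfaced to the player proactively.
if not memory_check_passes():
    print("Possible hardware fault detected; see the support page.")
```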

Risks in user content

My security class is talking about the commonly seen mistakes that crop up when writing programs and lead to security flaws, and while I usually introduce the ideas using “normal” programming examples, because that is the common background I can assume my students have, I’m trying to help the students map these ideas to what they’ve seen of database or web development as well. So I finally went back through my saved links and read a Google blog post from a month ago about security issues in hosting user content, specifically web content.

After a brief but reasonably nice survey of the problem they’re trying to address, they include this interesting statement, contrasting the current state of affairs with the old days of hosting static HTML: “For a while, we focused on content sanitization as a possible workaround – but in many cases, we found it to be insufficient. For example, Aleksandr Dobkin managed to construct a purely alphanumeric Flash applet, and in our internal work the Google security team created images that can be forced to include a particular plaintext string in their body, after being scrubbed and recoded in a deterministic way.”
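
To make their point concrete at the level of an intro programming course (a toy of my own construction, far cruder than the alphanumeric-Flash and recoded-image tricks they describe): even a plausible-looking sanitizer can be turned against itself, because one pass of stripping can assemble exactly the thing being stripped.

```python
import re

def naive_sanitize(html: str) -> str:
    # Remove script tags in a single pass -- a common first attempt.
    return re.sub(r"</?script[^>]*>", "", html, flags=re.IGNORECASE)

# The attacker splits the payload so that removing the inner tags
# reassembles the outer ones.
payload = "<scr<script>ipt>alert('xss')</scr</script>ipt>"
print(naive_sanitize(payload))
# Output: <script>alert('xss')</script>
```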

I’ve been trying to make the argument to our digital media students (as I get opportunities to talk to them) that they really ought to think of security as a good elective to round out their major, particularly those focusing on courses in web development and mobile application development. I’m sorely tempted to print out a copy of this article and go over and paste it on their lab door – or at least remind their professor to do another advising push towards the course on its next offering. These problems are perhaps outside the scope of what most web developers would encounter, but I wonder if rejecting the importance of understanding these issues would be analogous to an application developer believing that only someone working in operating system design really has to understand security.

What are you getting credit for?

A colleague sent me an article about a U.S. university accepting transfer credit for a Udacity course – something described in the headline and the first few paragraphs as a breakthrough: a school accepting a free, online course for full transfer credit.

The article gets interesting when you dig into it, though. The course in question is an intro-level “Introduction to Computer Science” course. And, in order to get the transfer credit, students have to not only earn a certificate of completion from Udacity, but also pass an exam administered at a testing center, at a cost of $89. That happens to be the exact same price as taking the AP CS exam, which, by the way, you can take even if you aren’t signed up for an AP class (with, I think, some hoops to jump through). And I’m going to bet that the transfer credit received is the same transfer credit students would get if they took the AP exam and did well on it.

So, the more accurate portrayal of the story is, I think, that there is a university that has decided it will now allow students to get transfer credit either for taking an AP exam or for taking one of these tests run by Pearson VUE (which, when you look into it, is already a massive testing operation). And the university accepting the transfer credit is applying it only to its fully online program, not its programs with physical campuses.

Moving into the realm of wild speculation now: even if this practice becomes widespread, it seems like it will turn into more of a threat for the AP/College Board than for colleges, which already allow students to get transfer credit for some introductory courses. With there still being fees and the need to show up at a testing center, I can see this broadening access to college-level placement tests. But (again, wild guessing) it seems like colleges have little to lose if some number of freshmen who would not otherwise come in with transfer credits now have a handful. Whereas, as a high school student highly focused on your GPA, you have an interesting choice: do you take an AP course, which may be weighted more heavily on your transcript but may also result in a lower grade, or do you take an easier course, get a better grade, supplement with an online course, and then take a Udacity exam for college credit? Factor in that the AP is a one-shot deal while it looks like you can retake the Udacity courses (more like the SATs), and there’s an interesting question of which is more appealing.

Robots run amok

Interesting story of the live webcast of the Hugo Awards being blocked by copyright enforcement bots. Short version: the live webcast included clips of the television episodes up for best script (as award ceremonies do), and UStream’s bots for detecting copyrighted work spotted them and blocked the entire rest of the broadcast. The article points out that not only is that fair use, but the clips were provided by the copyright holders, who were happy the content was being promoted as award-winning.

The whole thing is reminiscent of NASA’s footage of the Curiosity landing being removed from NASA’s YouTube channel under the claim that it violated Scripps News Service’s copyright on the material. The problem being that Scripps uploaded NASA’s video to their own stream and, accidentally they say, marked it as their own content. It ought to jump out at you that, whether Scripps made an honest mistake here or not, there’s plenty of potential for someone to fraudulently claim ownership of content and harass the legitimate owner, or reap profits from the content, when so little evidence is required. Figuring out that a live feed of a NASA rover and the NASA control room, during a highly publicized NASA mission, actually does belong to NASA has to be one of the easier cases to get right…

The common thread is the automatic disabling or removal of content without solid evidence that infringement is happening or, clearly, any human review. It also sounds like, from the Hugo Awards case, there isn’t anybody standing by on call to reverse these actions when errors are made. Add this to the list of things to worry about with both digital intellectual property management and what happens when you start moving to the cloud.

Let’s read some books!

An article on how reading is important for leadership feels appropriate for the start of the semester, particularly with its mention in the second paragraph of the difference between literacy and the capacity for deep reading.

A fun exercise is applying a bit of that “deep reading” to this article. You’ll probably notice that there are lots of fine anecdotal accounts of great leaders also being great readers. When the evidence starts coming out, things get shakier. The supporting link for the claim that reading offers the best stress reduction goes to a newspaper article about the study, which doesn’t make clear whether they were testing books as opposed to reading web content, and which also reveals that the study was part of a chocolate marketing campaign. The supporting references for the links between reading and increased verbal skills are more solid, but also do not seem to be specifically about the advantages of reading books. In fact, one of the cited papers breaks down the rate of rare words in various types of text and finds that books rank fourth, below scientific articles, newspapers, and popular magazines. While that article compares all of this to spoken language, to explain why simply watching television or engaging in conversation doesn’t have the same effects, there is no breakdown of where web-based reading falls.

But in the end, I also agree with the point of the article. I’m prepared to believe there are different effects from web versus book reading, probably based on issues of attention/distraction and the goals of the text, while recognizing that one would have to actually define what is meant by “web reading”. And I do hope all my students coming back next week are prepared to do a lot of book reading!

Brick-and-mortar college

For some reason, this article about Best Buy as a showroom (from earlier this year, but I only just read it) made me think about the conversation going on about MOOCs. I had to ask myself whether my willingness to turn to online stores for a deal, rather than spending more for a more robust experience (being able to try out products, get advice, have it immediately), makes my feeling that small, in-person classes are still worth the money hypocritical or, at least, motivated by self-interest.

So I thought about what I do buy in person rather than online. Clothes, obviously – once I was out of grad school and had a bit more money, that was the first thing I stopped shopping for online, being willing to pay more to get clothes that fit better. But in general, it’s items where I am picky about the features, particularly usability features. I could use a new computer bag, but no matter how many descriptions and pictures there are, I can’t pull the trigger on ordering one when I can’t try out the pockets myself (having a smartphone slightly larger than an iPhone means you’ve got to test that “phone pocket” really does mean generic phone pocket and is not just shorthand for “iPhone pocket”). Any housewares where I really care about the color – I’ll go to Target for my new dishcloths because I want to make sure the red matches my toaster and trivets and such, instead of clashing. And, items where I’m not sure I know enough about the product to buy it based on just text and images rather than seeing it in person – during a toilet-rebuilding project this summer, browsing the Home Depot site on my laptop next to the toilet to figure out which part I needed to buy was way more frustrating than just showing up at the store with a crumbling gasket and looking for the one that actually matched.

So it seems like I want to go to a store when the physical form of what I am buying matters – either to ensure that it fits (my body, or my toilet), or to ensure that its appearance and usability meet my needs (having effectively placed pockets, or being visually consistent with my decor). Stores that make this easy encourage me to buy from them rather than online. And, at least for me, Best Buy doesn’t sell products where this is a factor. They would need to provide something else. When I think about what is challenging about buying electronics online, it is the risk that when you get it home, it won’t do what you want. The printer drivers won’t play nice with your wireless network, an adapter or cable you didn’t know to order turns out to be needed, or the minimum memory claim on the software assumed you didn’t mind getting a cup of coffee between commands. At least as much of my time exploring products is spent defensively: after I have decided on the features I want and selected a product at a good price point, I try to convince myself it will actually work as described in my particular setting. Finding the sweet spot in answering that need in a cost-effective manner (what I think Geek Squad was intended to do, and what some of my Mac-owning relatives think Apple stores do well) may be what is required.

And what does this tell me about my perceived value of in-person education? Well, it sounds like a similar situation. We can educate ourselves using MOOCs and other free resources (libraries ftw!), but if your goal is an equivalent level of education, how much additional time and effort is required to figure out how to educate yourself, what an education actually is, and whether you are actually getting educated, versus spending that time becoming educated? And is it worth it to you to have someone else solve those meta-level problems, freeing your energies up just for the task of doing the learning?

Rethinking courses

I recommend not only this article about plagiarism in Coursera’s courses but also the comment thread, which makes a few interesting connections between the specific issue of plagiarism happening in these courses and the broader discussion about MOOCs and their role in the higher-education universe.

The obvious question, posed but not answered in any of this, is why students would plagiarize in a free, non-credit course that they are taking entirely voluntarily. If you want to just watch the lectures, or just do the reading, there’s absolutely nothing in the structure of these courses to prevent it. And completing the course gets you nothing more than a confirmation email with a lovely PDF “certificate” attached. My suspicion is that it isn’t that far off from why people cheat at games, even solitary, non-social games. As one comment puts it, we want to feel good about ourselves. Having signed up for a course, giving up on it because writing a paper is hard feels like failure, whereas plagiarizing is easier and makes us feel like we succeeded.

But then the broader questions come in. The article points out the vast cultural diversity within the courses and focuses on the “teachable moment”. Instructors in these courses are going to have to be much more explicit about underlying assumptions about how academic work should be done, and instruction on things like plagiarism and academic integrity probably needs to be integrated into every course, because every course may be the first one a student is taking. At heart, this seems to tie back to the lack of a prerequisite structure. At a traditional college, I can assume you took freshman comp and learned something about plagiarism. In Coursera, as it is set up so far, that’s not a valid assumption. For computer science or math courses, the prerequisite issue is more one of whether the student is prepared, but if you sign up for cryptography ignoring the statement that you need some statistics background, the issue will quickly self-correct as you realize that you can’t understand the lectures, let alone do the assignments. In a literature course, it is easier to muddle along and convince yourself you’re getting the idea, even if you don’t have the requisite background.

While it wasn’t the main topic of the article, this was also the first time I read anything about how essays are graded in these courses (they are peer-graded, with, it sounds like, no instructor involvement). It immediately makes sense from a cost-savings perspective. And I can see how it makes sense from a “wisdom of the crowd” perspective. Plus, peer review is a valuable exercise even in traditional classes – there’s a lot to be gained by reading and reflecting on another student’s work. But the wisdom of the crowd only works if you either have the entire crowd react to each piece, or if you are only interested in the assessments being reasonable at a crowd level (with high tolerance for a handful of inaccurate outliers). Both are problems here: you aren’t going to get each student to review hundreds of other essays to get a crowd response to each work, and if you’re the student who gets an outlier review, that radically decreases the value of the course for you (and, perhaps, reduces your motivation to work hard, and avoid plagiarism…).
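
A toy illustration of that outlier problem (my numbers, invented): with only a handful of reviews per essay, one unfair score badly skews a plain average, and even a robust aggregate like the median only softens the blow for that student.

```python
from statistics import mean, median

# Hypothetical essay scored on a 10-point scale by four peers, one of
# whom gives an unfairly low outlier score.
peer_scores = [8, 9, 8, 2]

print(mean(peer_scores))    # 6.75 -- the outlier drags the average down
print(median(peer_scores))  # 8.0  -- the median mostly ignores it
```

Grading on the median (or dropping the high and low scores) is the obvious mitigation, but it only helps once there are enough honest reviews per essay to outvote the outlier – which is exactly the crowd-size problem above.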

Finally, I found one comment particularly interesting (from “husky1”): “I am curious, can students cheat when there is no grading standard? And, why would you have grading standards in a non-credit course? Feedback on the material submitted by the student is one thing. If you really believe MOOCs are a poitive thing then I am not sure “plagerisim” should be a concern. I suspect even someone who cuts and pastes material from say Wikapedia is actually learning something about the topic they are addressing. And isn’t the point here for each person to get what they want out of participating in the course?”

What jumps out at me is that this comment assumes that plagiarism is a concept that is only relevant in the world of graded assignments – that it is a construct of academic assessment, not a real-world principle to be followed. More generally, it highlights the degree to which we’re still figuring out what these MOOCs are. Are they just about putting information out there to help people who want to learn? Ought they hold students to some level of academic standard? In what sense are they a “course”? I see a tension between people signing up to “get what they want” out of the course, and the idea of instructor-created learning outcomes for a course. If there is a fact you want to know, or a skill you need to develop, you can read a book, watch a video, or participate in an online “course” that will help walk you through gaining that particular collection of factual knowledge or well-defined skill. But, to my mind, a course is a more completely defined experience, and part of what you are getting is the determination, by someone with more expertise than you, as to what one should get out of the course. I am confident that the majority, if not all, of the content covered in these courses is available through other sources (books, articles, etc.). There is value in having someone with expertise filter the content, present it in a more appealing video format, and wrap it in explanation and context – that is certainly part of what makes a “course”. But if you lose the idea that the course has been designed to achieve some set of goals established by the instructor, and instead allow that any goal a student might have in participating must be equally valid and supported, I think you may have lost the “course-ness” of the course.

This entry readable in lynx 2.8.3 or higher.

This security critique of the Tesco website is a hoot. It walks through an increasingly deep, and increasingly damning, look at what is wrong with their setup, and how you can tell. The critique is well peppered with links to additional content about the problems being described, so it’s not a bad starting place to learn something about web security. It is also an accessible illustration of the type of exploration and deduction that can be used to profile a system and its vulnerabilities. Finally, to me, it reads as a nice lesson in why you can’t just “throw some security on your site” without real expertise. I like the concept of “unconscious incompetence” being used to describe the situation where incompetence (here, about security) is compounded by a lack of awareness of the incompetence. If you at least know what you don’t know, you’re moving a step in the right direction!
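
As a small taste of the outside-in observation such a critique relies on (a sketch of my own, not code from the article; the URL is a placeholder), even the response headers a site sends reveal something about its security posture:

```python
import requests

# Point this at a site you're permitted to examine; example.com is a
# placeholder. We're only reading what the server volunteers.
resp = requests.get("https://www.example.com/", timeout=10)

for header in ("Strict-Transport-Security",
               "X-Frame-Options",
               "Content-Security-Policy",
               "X-Content-Type-Options"):
    value = resp.headers.get(header)
    print(f"{header}: {value if value else 'missing (worth a closer look)'}")
```

A missing header doesn’t prove a site is exploitable, but it’s the sort of observable detail this kind of critique stacks up, one deduction at a time.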

Readability versus Realism

I’ve been reading a lot about games and game design over the past few months, and this recent blog post about when visual detail in games becomes overwhelming rang true for me. It’s responding to the difficulty that can emerge when trying to actually play the stunning, complex 3D games that are coming out, compared to less graphically “sophisticated” games. The idea of readability, and what makes a game readable, is nicely discussed. I liked the idea, hinted at, that if your game is only playable because you have added meta-labels that appear when you’ve successfully found or targeted an object, this may be a sign your game’s visuals are sacrificing readability for complexity.