“And You Learned That Where?” (Wiki’s fine for a bar bet, but for a research paper? Not so much)

You’re writing that research/persuasive/white paper, and you need some verifiable facts to make your case. So you Google/Bing/Whatever-the-engine it, and the first or second entry you see will likely be a Wikipedia (“Wiki”) entry. It’s probably full of great stuff. Easy-peasy, yes? Nope. The problem with using information from a Wiki or Wiki-type listing is simple: anyone can edit it, from experts with a couple of PhDs and years of experience to a 10-year-old working on his tablet. Wiki entries are usually reviewed by experts and often contain incredibly useful information, but the people reading your paper have no idea whether that particular source was correct that day. There are ways of utilizing Wikipedia, but it takes an extra step, and we’ll talk about that in a few moments. In the meantime, here are some sources you should not use for scholarly works (that is, anything needing to be verified):

  • Wikipedia or other “Wiki” pages – As explained above, anyone can edit them.
  • Blogs – Unless the blog (or guest blogger) is a writer with verifiable credentials, we have the same issues as with Wikis above. In fact, many high school and college instructors won’t allow blogs as research sources regardless.
  • About.com – This is not a shot at About.com. But its answers are often gathered by web searches, which in some cases prevents verification, and in other cases the answers aren’t cited at all.

Sources you should be concentrating on for more uniformly credible information include:

  • Peer-Reviewed Journals/Publications – When an organization has its own scholastically recognized experts, it will often publish a regular journal or magazine that is referred to as peer reviewed (sometimes “juried”). These are made up of articles that are reviewed by an editorial panel of those recognized experts. Larger examples include JAMA (The Journal of the American Medical Association) and the Journal of Sports Science and Medicine.
  • University-published or university-sponsored papers – This one is not an automatic slam dunk, but if the source site is educational (usually designated by a .edu address), chances are excellent that it is a vetted and proper source to cite.

Then there are the “gray areas” – sites like History.com and many nonprofit agencies (.org) are often considered reliable sources. These are the ones you may need to ask your instructor about. Even if they have biases, they are likely to provide data that is verifiable. But the easiest way to ensure compliance in research sources is to search through a college (and sometimes high school) library’s databases. Most of those search engines have a filter that shows only peer-reviewed journals, if you decide to use it.

And finally, the how-to-use-Wiki tip I promised up there: in the more reliable Wiki articles, you will often find the facts annotated and cited. That is, the Wiki author notes where he or she found the information. Simply follow the citation to the original source, verify it, and cite that source yourself.
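If you want to speed that step up, the citation harvesting can even be scripted. Below is a minimal Python sketch, my own illustration rather than anything official, assuming the requests and beautifulsoup4 packages are installed; “Peer_review” is just a stand-in article title. It pulls the external links out of a Wikipedia article’s References section so you can go read the originals:

```python
# Minimal sketch: harvest the cited external links from a Wikipedia
# article's References section, so you can verify and cite the
# originals instead of the Wiki page itself.
import requests
from bs4 import BeautifulSoup

def reference_links(article_title):
    """Return the external URLs cited in an English Wikipedia article."""
    url = "https://en.wikipedia.org/wiki/" + article_title
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = []
    # Footnotes live in ordered lists with class "references"; each
    # citation's outbound link carries the class "external".
    for ref_list in soup.find_all("ol", class_="references"):
        for anchor in ref_list.find_all("a", class_="external", href=True):
            links.append(anchor["href"])
    return links

if __name__ == "__main__":
    # "Peer_review" is just an example article title.
    for link in reference_links("Peer_review")[:10]:
        print(link)
```

The script only collects the footnote links; judging whether each underlying source is itself citable is still your job.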

Dragged Kicking and Screaming into Tweetsville

Well, it was only a matter of time. I had avoided tweeting for the first nine years of its existence. Technically, I STILL have avoided it, as I have yet to tweet my first tweet (tweet my first twit? I get confused about social media conjugation), but it is now simply a matter of time. See, it’s an assignment. I have a graduate class in Collaboration in a Virtual Environment, and one of the collaboration-in-a-virtual-environment tools we are to explore is, yes, Twitter.

I finally succumbed to Pinterest because it actually gives me a platform to organize many of my interests and sites/postings in a more Larry-Friendly manner than, say, bookmarks. Yes, to call me “Old School” is pretty much akin to calling the aircraft carrier Enterprise a rowboat. But this Twitter thing… I’m still trying to figure out the subtleties… like why it exists.

I am doing my due diligence, and truth to tell, there are some happy advantages that I think I can use in my education career going forward: it can pass quick announcements to my students; it can pass along useful notes, links, and research materials, both for assignments I give and for projects I research myself; and it can even put me in contact with people I can learn from and network with (a rough sketch of the announcement idea follows). Looks like I might wind up being a social creature after all. Despite my best efforts.
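For the curious, here’s roughly what that classroom-announcement idea looks like in code. This is only a sketch using the third-party tweepy library’s v3-era OAuth flow; the four credential strings are placeholders you’d get from a Twitter developer account, and the announcement text is invented:

```python
# Sketch: posting a one-line class announcement via Twitter using tweepy.
# All four credentials below are placeholders; nothing here is an
# official classroom workflow.
import tweepy

CONSUMER_KEY = "your-consumer-key"
CONSUMER_SECRET = "your-consumer-secret"
ACCESS_TOKEN = "your-access-token"
ACCESS_TOKEN_SECRET = "your-access-token-secret"

# Standard tweepy OAuth handshake.
auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)

# One call, and every student following the class account sees it.
api.update_status("Reminder: annotated bibliographies are due Friday. "
                  "Office hours today, 3-5 pm.")
```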

WebQuest – A Really Big Thing in Internet Research, or Smoke and Mirrors?

Have you heard of this thing? Apparently, WebQuest has been around forever (in Internet-Years). In fact, according to WebQuest.org, the original model for this was developed by Bernie Dodge at San Diego State University in early 1995. Personally, I’ve only been exposed to it in the last few months through a graduate-level course in Teaching with Technology.

While it’s been a useful tool in setting up lesson plans and even in larger curriculum development, I’m thinking I might be missing some of the nuances here, considering some of the “best thing since sliced bread” comments I’ve run across. There are quite a number of sites around that share WebQuest lesson plans, as well as provide templates for creating them. Two of the larger environments for them seem to be WebQuest.org itself and Zunal.com. Zunal seems to be a more plug-and-play-friendly template home, whereas WebQuest talks more about what it is and how it was developed.

WebQuests are designed to enable a pretty deep, critical-thinking level of learning. Each WebQuest contains the following categories, in order (a quick structural sketch in code follows the list):

Welcome: A brief overview: What it’s about, who it’s for (grade level), and what discipline.

Introduction: Setting, background, overview of the assignment.

Task: The heart of the matter – What you want to see accomplished.

Process: How will your students complete the above task?

Evaluation: Pretty self-explanatory. How will you grade it? What is your rubric for evaluating the work?

Conclusion: To wrap up for the student. What did they learn? What do they now WANT to learn?

Teacher’s Page: This one lists anything a fellow educator who uses your WebQuest might want to know: additional resources, credits, observations, etc.
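Since those seven categories are really just a fixed template, they can be modeled as a simple data structure. Here is a minimal Python sketch, my own illustration rather than an official WebQuest.org or Zunal format, with invented water-cycle content, handy for drafting a WebQuest before pasting it into a template site:

```python
# Sketch: the seven WebQuest sections as a data structure for drafting.
from dataclasses import dataclass, fields

@dataclass
class WebQuest:
    welcome: str        # overview: topic, grade level, discipline
    introduction: str   # setting, background, assignment overview
    task: str           # what you want to see accomplished
    process: str        # how students will complete the task
    evaluation: str     # how you will grade it; your rubric
    conclusion: str     # wrap-up: what was learned, what's next
    teachers_page: str  # notes for fellow educators: resources, credits

    def outline(self):
        """Print each section heading with its draft text."""
        for f in fields(self):
            print(f"{f.name.replace('_', ' ').title()}:\n  {getattr(self, f.name)}\n")

# Example content (entirely made up) for a grade-school science unit.
quest = WebQuest(
    welcome="The Water Cycle, grades 4-5, Earth Science.",
    introduction="You are a raindrop about to take a journey...",
    task="Create a storyboard tracing one water molecule through the cycle.",
    process="Work in pairs; research each stage using the linked sites.",
    evaluation="Rubric: accuracy (40%), creativity (30%), presentation (30%).",
    conclusion="Reflect: which stage surprised you? What next?",
    teachers_page="Allow two class periods; links and credits go here.",
)
quest.outline()
```

The point of the dataclass is simply that a WebQuest is complete only when all seven sections are filled in; leave one out and the constructor complains.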

Finally, WebQuest does in fact seem a useful tool, especially for sharing standardized lesson plans and curricula with peers and not having to reinvent the wheel, given that there are literally thousands of already completed WebQuests out there to share. I’m still not sure it’s in that sliced-bread category, though.

The Authorship Agnostic: So You’re REALLY Sure Who Shakespeare Was, Eh?

First of all, a very important disclaimer: I am NOT, nor have I ever been, a member of any Anti-Stratfordian cult, group, organization, or sewing circle, including but not limited to the proponents of Oxford, Bacon, Neville, Marlowe, et al. (although I am, historically, a fan of anything that promotes bacon the food group). In the interest of full disclosure, though, and for the purpose of this post, I am also not strictly a Stratfordian (one of those convinced that Shakespeare’s works were in fact written by the guy with the really big forehead and bad haircut). I’m a Shakespearean Agnostic. I’m hoping to make a few of the reasons clear below.

What perplexes me is the vehemence of the arguments on both sides. Stratfordians (the vast majority of at least the vociferous critics) laugh at, scoff at, and more usually abuse those who proffer their reasons that Edward de Vere, 17th Earl of Oxford, or Francis Bacon, or Christopher Marlowe, or someone else was in fact the “one true author of Shakespeare’s writing.” I have certainly not read all, or even the majority, of either side’s manifestos; the articles, books, papers, and lectures number in the thousands, if not tens of thousands. I have in fact seen arguments made for no fewer than 10 different people being the author(s), as well as dozens of refutations from Stratfordians.

But the key here is this: the refutations consist mostly of reasons why the Anti-Stratfordians’ arguments may not hold water (the bias against a “commoner” knowing enough to write that well is just a bias; the lack of verifiable documents of/by Shakespeare himself is circumstantial, etc.), rather than any actual factual historical data. I’ve seen refutations pointing out that we have no more “evidence” that Marlowe wasn’t Marlowe, or that Ben Jonson didn’t write what he was credited with, than we have against Shakespeare, and so on. Okay, I’m personally fine with that assumption too. Not only is it impossible to prove a negative; the lack of genuine primary sources can also fail to prove a positive.

As a fundamental example of a “refutation,” Tom Reedy and David Kathman wrote a widely shared article called “How We Know That Shakespeare Wrote Shakespeare: The Historical Facts.” In it, the authors cite things like: “The name William Shakespeare Appears on the Plays and Poems.” Well, if you assume ANY sort of intrigue, including that plays at the time were not considered literature (a point the authors themselves make in the same paragraph) and that a noble would not want his name associated with them, then having a “mere actor” in a company named as the author of a collection of plays is hardly evidence of true authorship. In fact, the first six plays published in quarto were initially published without an author, and the first four of them NEVER had an original author name published. By Reedy and Kathman’s logic, does that mean that’s evidence that Shakespeare didn’t write those?

Two more pieces of “evidence” cited were the facts that we know William Shakespeare was an actor and that he lived in Stratford-upon-Avon. Well, okay, we do in fact have some evidence of those facts, neither of which sheds light on actual authorship.

Again, please remember, I am not advocating that I know or believe the man we know as Shakespeare didn’t write Shakespeare. Nor am I advocating anyone’s name above anyone else’s for that honor. I do believe that someday we may have more significant historical evidence of who wrote it (hell, we found Richard III in a car park in Leicester, England). I’m simply very content to go on reading, acting, directing, and loving the words. Whether they are Shakespeare’s, Oxford’s, Bacon’s, Marlowe’s, or anyone else’s, they are quite probably the best words, collectively, ever written in the English language. And quite possibly, in any other language.

Incrimination Recriminations: Amend the Fifth, Don’t Plead It

First, what will become one variation of my mantra of choice: I’m not a lawyer, nor do I play one on TV. While opinions here are in fact the opinions of management, I’m a really small company. Your mileage may vary.

“…(N)or shall (any person) be compelled in any criminal case to be a witness against himself…”

Fifth Amendment to the United States Constitution.

A pretty simple sentence that has been the cornerstone of a lot of criminal trials and some significant case law, most famously Miranda v. Arizona, 384 U.S. 436 (1966), which says that upon arrest, you have the right to be told that anything you say can and will be used against you, and that you are entitled to legal representation before speaking. That got me thinking: we’ve all seen accounts of, or heard of, drug busts, organized crime arrests, and white- and blue-collar crime alike that have used wiretaps or other voice-gathering equipment to secure convictions of those arrested.

It’s probably safe to say that virtually none of those suspects had waived their Fifth Amendment rights, and, even when those wiretaps were legally authorized, all of the suspects had a reasonable expectation of privacy (their own home, office, etc.). Can a judge authorize a warrant voiding a suspect’s Miranda rights (and no, I’m not referring to the Patriot Act; that’s a whole ’nother kettle of fish)? And yet, when a legally admissible wiretap shows the suspect agreeing to hire a hitman, or buy/sell drugs, or embezzle, said suspect is in fact incriminating him- or herself. The right to avoid self-incrimination is pretty useless until after the suspect is arrested and in custody. A whole branch of investigative criminology is based on trapping the suspect into incriminating himself, or otherwise being a witness against himself, if only through violations of that expectation of privacy.

Along with the hypocrisy of this, we also treat attorney-client privilege inconsistently. If a prosecutor presented a taped conversation of the accused confessing to his lawyer, he would be laughed out of court, if not disbarred. If he presents a taped conversation (obtained with a warrant) of the same confession made to the accused’s private secretary, it is likely admissible – even though the accused had that same expectation of privacy.

A really important distinction for me in this, though: I DO think that justice is justice, and people doing bad things should get locked up. I simply believe we shouldn’t pretend we’re providing “protections” that we aren’t. And, more importantly perhaps, protections that we should NOT be providing. I’ve never quite understood why we don’t treat confessions the way we treat wiretapped evidence: as long as it’s obtained legally (i.e., without the notoriously unreliable and illegal torture), there should be no protection against self-incrimination. Don’t want to get caught? Don’t do it. Amend the Fifth, get rid of the self-incrimination clause, and treat evidence as evidence.

Be Careful, Your Children Might Just Get the Education You’re Enabling

No profit grows where no pleasure is ta’en – In brief, sir, study what you most affect.

William Shakespeare

One of the drawbacks to being an older adult who has returned to work in educational and curriculum design, as well as teaching, after having first gone to school during the time between the Kennedy and Carter administrations, is that my first reaction to the state of the current educational system tends to sound like it comes from the guy that screams at your kids to stay off of his lawn all summer long. You know the type. The one that begins every third sentence with “When I was in school… yadda yadda yadda.”

Oddly enough, with money getting tighter and technology branching out, we have two entirely different things going on: more kids are dropping out, and the ones who are not are competing harder than ever to get into the best colleges. The problem with the former group is obvious. But the latter group is becoming an increasing problem “out in the world” as well. From the abomination that is “No Child Left Behind” (more about NCLB in a post for another day) to Common Core (also all kinds of problems, but then again, baby steps), the problem of figuring out who gets the limited finances available at the state level was dealt with by standardizing not only the tests, but the expectations.

On the surface, it’s a tidy little solution: We now have “STANDARDS”. We test the students, we match the matrix, we dole out money. Never mind a few little complications like:

  • We give the same lesson designed for an at-risk teen in Harlem or East Compton to an affluent resident of Central Park West in New York or to a student who grew up on a farm in Ottumwa, Iowa. A noble idea, but with a little flaw: these three groups do not have the same readiness or support systems. They are culturally as different as can be, and the same teaching techniques, curricula, and learning objectives will frustrate one group and bore another.
  • We teach ONLY the test answers, so that learning doesn’t in fact take place, just rote memorization.
  • We restrict money to schools (and thus to teachers, administrators, and school programs) based on these scores, so schools have even taken to cheating, or at the very least, to contemplating cutting corners.

But we have so much more available to us in the way of options these days. We have the ability to Differentiate how we teach our students. Simply put, that means we figure out how to increase the number of ways we teach, the number of ways we assess, and the number of ways we challenge the next generation. New technology tumbles out every week from all over the world, not just Silicon Valley; new research is done every year about how we learn as a species; and new social media evolves to tie the two together.

In the coming days and weeks, we’ll look at some of these alternatives, their advantages and disadvantages, and things we can do. One student, one parent, one educator at a time.