Tuesday, January 29, 2013

Early Thoughts on the Chromebook

Like many other districts and schools across the nation, we live in a state (CO) that has chosen PARCC as an assessment tool for CCSS. As we've been exploring what kinds of hardware we need to have in place for PARCC, we decided to test drive a few Chromebooks to see if those would fit not only our assessment needs but our instructional needs as well.

Thanks to Promevo (they've been great to work with), we were able to get some loaner Samsung models for a couple of weeks, and we purchased a couple of Acer models.  Here are some things we've found in the short time we've had them.  No surprise -- this one goes to 11.

  1. Great for web-based workflow.  It's amazing how much you can do on a Chromebook.  I've been using one exclusively since I got it, and I am impressed.  Disclosure:  I'm a heavy GoogleApps user, and I rely on web-based applications for almost everything (even movie editing with WeVideo).  I was emailed PowerPoints and Word documents from Office users, and the browser displayed them easily.  Obviously I couldn't edit them, but I could have converted them into GoogleApps versions if needed.  I've been able to use it for about 90% of my workload.
  2. Not great for Java-based or Shockwave content.  What about the other 10%?  Unfortunately for us, we still rely on Java for things like our gradebook and some of our educational content.  And we use Everyday Math at our K-5 level, which relies upon Adobe's Shockwave Player.  However, we use computer labs or laptop carts for students now, and there's no reason why we couldn't continue to use them for specialized things.  Hopefully, more vendors will make the switch to HTML5, which would solve many problems for us.
  3. Display hinge seems a bit flimsy (Samsung).  I'm a little worried about how much wear and tear this part of the machine could take, especially when used by students.  I saw some blog posts from other schools using the Samsung, and they brought up the hinge as a problem they've experienced (even with careful students and cases).  However, the Acer seemed like it had a sturdier component for the hinge.
  4. It's easy to use.  The overall user experience is comfortable, very quick, and user-friendly.  It starts up and shuts down in seconds, which is pretty key if you're in a classroom with antsy students (or adults).  We constantly battle kids shutting lids before a machine completely shuts down, and the Chromebook would definitely cut down on that problem.
  5. Battery life is pretty decent.  Battery life on the Samsung was very good (on both models I tried).  The Acer came in at about 4 hours for me, so the Samsung felt like it had more stamina (I got 6.25 hours out of the Samsung 330).  The power supply on the Acer is oddly shaped and is a problem when trying to plug it into a power strip.  The Samsung didn't have that issue, though.
  6. Projector mirroring not available on Samsung 330.  The Acer has a VGA port, and it worked as expected when connected to a projector.  The Samsung 330 has an HDMI port, but when I hooked it into our wall panel, it would not mirror.  Either the image was on the Chromebook display or it was on the projector -- not both.  I did some searching, and this seems to be a reported issue with the 330.  Maybe not a deal killer for a kid machine, but it would be nice if mirroring didn't depend on the model.
  7. Wireless access isn't WPA2 Enterprise friendly.  We've had trouble with iOS devices on this front as well, and the Chromebooks I tried also struggled with our WPA2 Enterprise wireless networks.  They connected very easily to our other networks and were a cinch to use in various locations.
  8. Lightweight and easy to carry.  The Samsung in particular is a really nice looking machine and is surprisingly light.  It's easy to carry, but the MacBook Air-esque finish also makes it a bit slippery.  The Acer isn't that much heavier, but it isn't as thin as the Samsung.  I haven't gotten to see the rugged Lenovo version, but that looks heavier (but tougher).  Either way, it was a nice change from the HP laptop I've been lugging around.
  9. Couldn't find a way to screencast.  This could be user error on my part, but I couldn't find an app for screencasting.  I had no problems taking stills or using the webcam for a Hangout, but I didn't see how to capture what I was doing on the device.  I suppose I could use Chrome Remote Desktop from a machine with capturing capabilities, so there are workarounds; however, I hope that feature (or app) becomes available.
  10. USB devices seemed to work fine.  I didn't have a chance to do extensive testing here, but I tried thumb drives, mice, keyboards, and USB headsets.  Everything worked without a hitch.  I didn't have a USB document camera to test, but the usual USB suspects played nicely.
  11. Management through the GoogleApps control panel is great.  We got two licenses, just to test it out, and I was pretty impressed with what you can control, both at the user and at the device level.  They'll need to figure out a way to turn off the camera and Bluetooth for the Samsungs to be PARCC compliant, but it looked to me like you could set policy and have things running smoothly with little technical background (unlike Active Directory, which does require a bit of expertise when it comes to policy).  The only snag I saw was that it could take 24 hours to propagate changes to managed devices.  You'd have to plan ahead.
I think these have a lot of potential, especially at the price point.  I'm not sure we'll be able to adopt them widely at this point since our state's online tests for both science & social studies require Java, but if Java were removed from the equation, I'd have a hard time arguing against them.  They work well for the majority of things our students do on a daily basis for learning (web research, multimedia creation using web-based tools, GoogleApps for writing, web apps for math & science, etc.).  We won't be able to get away from computer labs for high-end use (at least not in the near future), but Chromebooks would be a great way to increase access without seriously depleting a budget.

Monday, January 21, 2013

Text Complexity on the Web: 5+ Tools for Quantitative Evaluation of Online Text

Three factors for measuring text complexity (taken from corestandards.org)
This is cross-posted on TeachThought.com, where I wrote a guest blog post.

I posted back in 2011 B.C.C. (Before Common Core) about finding accessible online text, but a recent blog post from Eye on Education (How to Select Complex Text to Increase Rigor) made me think about revisiting the topic.  My original post was more about finding reading passages for differentiation purposes, but the Common Core's approach to measuring text complexity has now elevated that need to a whole new level.  This post specifically addresses one aspect of text complexity -- what the Common Core terms "quantitative evaluation."  Before delving in, though, it's important to note from the outset that other measures must be in place to adequately explore complexity.

Currently, there are many web-based tools that help with the quantitative evaluation of books and even textbook content (for example, you can use Barnes and Noble to search by Lexile measure); however, as our students will likely be reading a combination of print and digital materials (especially in states giving the PARCC test), tools that help identify readability scales for online or digital text are also necessary.  Here are five (mostly free) web-based tools that might be helpful as we curate reading content for students.


1.  Online Databases.  These should probably be at the top of your list when looking for online text.  Many schools, districts, and public libraries across the country pay subscription fees for online database collections like EBSCO and GALE.  These are mostly free tools for students and teachers -- they are paid subscriptions, but the costs are typically covered elsewhere.  The databases included in those services vary depending upon subscription, but check the search options for either a Lexile number or a Lexile range.  (The image on the right is from one of our high school's EBSCO database searches.)  You can check the Lexile website for a list of database providers that include Lexile information as part of their service.

Even if your school or district doesn't pay for these types of databases, chances are good that your public library does.  And if you are lucky enough to have a certified librarian in your school, be sure to befriend them.  Librarians are incredible resources for finding grade-appropriate material and assisting with anything related to information literacy.


Taken from Russell's SearchResearch blog
2.  GoogleSearch by Reading Level.  This is a decent starting point if you're using Google's search engine.  In any GoogleSearch, you can go into the "Advanced" search options and choose to filter by basic, intermediate, and/or advanced reading level.  Daniel Russell's blog post about this feature explains how Google designed the filter: "We paid teachers to classify pages for different reading levels, and then took their classifications to build a model of the intrinsic complexity of the text. . . We also used data from Google Scholar, since most of the articles in Scholar are considered advanced."

In Google's classification, "basic" equates to an elementary level, "intermediate" applies more to the secondary (6-12) grade range, and "advanced" indicates scholarly or post-secondary text.  Because these ranges are so broad, though, it might help to start by limiting a search to either "basic" or "intermediate" and then use one of the tools below to gather more detailed information.

3.  JuicyStudio.  If you have a URL and you'd like to check its readability scale, you can paste the URL into this website's readability test.  It will run the page through its algorithm to figure out the reading level.  This free service also tabulates how many words & sentences are on the page, as well as counting how many words have 1, 2, 3, or 4 syllables.  There are explanations for what the different reading indexes reveal.  Lexile numbers aren't specifically identified, but other indexes are used (including Gunning Fog and Flesch-Kincaid).
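
To make these indexes less of a black box, here's a minimal Python sketch of the two formulas JuicyStudio reports.  The formulas themselves are the published ones, but the syllable counter is a rough vowel-group heuristic of my own (not the one these sites actually use), so expect the scores to differ slightly from theirs:

```python
import re

def count_syllables(word):
    """Crude syllable estimate: count vowel groups and discount a
    silent final 'e'.  Real tools use dictionaries, so small errors
    are expected."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def readability(text):
    """Compute the Flesch-Kincaid Grade Level and Gunning Fog index."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Gunning Fog counts "complex" words of 3+ syllables (the published
    # formula also excludes proper nouns and common suffixes; this doesn't).
    complex_count = sum(1 for w in words if count_syllables(w) >= 3)

    wps = len(words) / len(sentences)
    fk = 0.39 * wps + 11.8 * (syllables / len(words)) - 15.59
    fog = 0.4 * (wps + 100 * complex_count / len(words))
    return {"flesch_kincaid_grade": round(fk, 1),
            "gunning_fog": round(fog, 1)}

print(readability("The cat sat on the mat.  It was a sunny day."))
# {'flesch_kincaid_grade': -0.6, 'gunning_fog': 2.2}
```

Both formulas boil down to words-per-sentence plus a vocabulary-difficulty term, which is why long sentences and polysyllabic words drive the scores up.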



4.  EditCentral.  Like JuicyStudio, EditCentral is a free tool that runs text through an algorithm for various readability indexes.  Instead of pasting in a URL, though, this site allows you to paste in text (up to 50,000 characters).  This one also doesn't provide Lexile information, but it does color-code the results of the different reading scales, and it also underlines words that might be considered complex or difficult.  That is usually determined by the number of syllables, but it could serve as a good way to anticipate words that may increase the level of difficulty.
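
As a quick illustration of what that underlining is doing, here's a tiny sketch with the same caveat as above: the syllable heuristic is mine, and EditCentral's actual rule may differ.

```python
import re

def complex_words(text, min_syllables=3):
    """Flag words with 3+ estimated syllables -- roughly what
    EditCentral underlines as potentially difficult."""
    def syllables(word):
        word = word.lower()
        n = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and n > 1:
            n -= 1
        return n
    return sorted({w.lower() for w in re.findall(r"[A-Za-z']+", text)
                   if syllables(w) >= min_syllables})

print(complex_words("Quantitative evaluation of online text complexity."))
# ['complexity', 'evaluation', 'quantitative']
```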


5.  StoryToolz.  I came across this free tool thanks to a post on the ESL Trail Blog.  Akin to EditCentral, you can paste in text (up to 5K characters without a login, up to 50K with a login), and it will generate several reports.  It uses similar indexes (not Lexile) for determining readability scores, but this site generates additional reports that could aid in writing instruction.  The "Word Usage" report gives statistics on items like "to be" verbs and prepositions, while the "Sentence Beginnings" report identifies how many times different parts of speech start a sentence.
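
Reports like these are also easy to approximate.  The sketch below is a simplified stand-in: the word lists are my own abbreviations, and where the real "Sentence Beginnings" report identifies parts of speech, this version just tallies literal first words.

```python
import re

# Hypothetical, pared-down word lists for a "Word Usage"-style report.
TO_BE = {"am", "is", "are", "was", "were", "be", "been", "being"}
PREPOSITIONS = {"in", "on", "at", "by", "for", "with", "to", "of", "from"}

def word_usage(text):
    """Count 'to be' verbs and common prepositions."""
    words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
    return {"to_be_verbs": sum(w in TO_BE for w in words),
            "prepositions": sum(w in PREPOSITIONS for w in words)}

def sentence_beginnings(text):
    """Tally the first word of each sentence."""
    starts = {}
    for s in re.split(r"[.!?]+", text):
        first = s.strip().split()[:1]
        if first:
            key = first[0].lower()
            starts[key] = starts.get(key, 0) + 1
    return starts

sample = "The dog was fast.  It ran to the park.  The park was empty."
print(word_usage(sample))           # {'to_be_verbs': 2, 'prepositions': 1}
print(sentence_beginnings(sample))  # {'the': 2, 'it': 1}
```

For writing instruction, a high count of "to be" verbs or repetitive sentence openers is a quick signal that a draft needs revision.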

In addition to the tools listed above, you can also access the Lexile Analyzer.  If you create an account, you can upload a .txt document of up to 1000 words for free analysis.  Educators can request access to the professional version for longer documents.  Certain formatting and steps are required prior to upload (details are on the website).

I'm sure more tools are out there, but these are the web-based tools I've found that may help with quantitative evaluation, and you can even use a non-web-based program like Microsoft Word to get a basic Flesch-Kincaid Grade Level.  As mentioned at the beginning of this post, it's important to keep in mind that this is only one facet of evaluating a reading selection and should never be used as the sole basis for determining complexity.  It is, however, a good place to start, especially as we try to discover diverse, timely, and relevant web content for our learners.