Measures of Learning Effectiveness Category

Wednesday, June 8th, 2011

Competency Frameworks: A First Step


I’ve been seeing the phrase “competency frameworks” a lot lately.  I’m glad.  I’ve long been concerned about the disconnect between training content and job performance.  In a quarter decade of business training, I have rarely felt much attention was given to the question: “What do these people need to know to do better in their jobs?”  I often felt that training was designed from the starting point of “here’s what I know so that’s what I’ll teach”.

So, what is a competency framework and how will it improve the effectiveness of training?

The HR Dictionary defines a competency framework as “the set of duties or tasks performed as part of a job with the standards which should be achieved in these duties”.

OK, so for every training course we design, we need to know:

  • What job are we training for?  In other words, what duties or tasks are we teaching someone how to do?
  • What are the standards that we will measure against?  How will we know if our students have learned enough of the right things to perform those duties?  How will we know our training accomplished this?
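
One lightweight way to capture the answers to those two questions is to write them down per job, duty by duty, with the standard and the planned measurement next to each duty.  Below is a minimal sketch in Python (my own illustration, not part of any competency standard or tool); the job title, duties, and standards are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Competency:
    """One duty or task, paired with its standard and how it will be measured."""
    duty: str
    standard: str     # what "good enough" looks like on the job
    assessment: str   # how training effectiveness will be checked

@dataclass
class CompetencyFramework:
    job_title: str
    competencies: List[Competency] = field(default_factory=list)

# Hypothetical example for a customer service representative
csr = CompetencyFramework(
    job_title="Customer Service Representative",
    competencies=[
        Competency(
            duty="Resolve billing inquiries",
            standard="90% of inquiries resolved on first contact",
            assessment="Scenario-based quiz plus a sample of call audits",
        ),
        Competency(
            duty="Log issues in the ticketing system",
            standard="All required fields completed accurately",
            assessment="Audit of a random sample of tickets",
        ),
    ],
)
```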

In my mind, competencies for education are fairly well-defined and adhered to through a very strict accreditation system.  It is relatively easy to accurately measure students’ understanding of geometry, grammar, or da Vinci’s work.  Education provides foundational knowledge; training is the application of that knowledge in a specific situation.  My brother-in-law (a math whiz) was always amazed at how his grandfather used calculus in his machining job.  But Granddad didn’t actually know calculus; he knew some rules for machining.  My brother-in-law, with his foundational knowledge, can apply what he knows about math to just about any situation.

The difficulty with business training is that job descriptions (and their related competencies) change frequently.  People in those jobs come from varying backgrounds.  Often, people have to adapt to new job requirements because that’s the best thing for the company.  An example would be that of typists.  There’s no such thing as a typing pool any more.  For a while, typists were converted to word processors (using machines of the same name).  That transition required an entirely new competency: using a computer.

Businesses try to fill the gap between the knowledge and skills of workers (old, young, new, tenured) and what the business needs at that moment with training.  Not only is it difficult to determine what training is required to close that gap, it is even harder to measure whether the training is effective.  Sadly, it is more difficult still because the very people in charge of these efforts are often not competent in training design or testing!  I’m hoping that the increased emphasis on competency frameworks in software will bring some attention to the concept itself.

Much the same as when mapping a process, the people who do the job should be involved in determining the necessary competencies.  Mind Tools™ has posted an excellent article on the subject, which includes a step-by-step guide to get it done.  As they say, it will take a lot of effort; effort by the people who actually know the positions.

The US Army is very good at defining job duties and training to them. Every job, at every classification, has defined skills within the MOS system. (Note: this term varies by branch of service, but the structure is very similar.)  Here is an example for a US Army Corps of Engineers Diver at five skill levels.  Notice how this also includes required scores on fitness and written tests, as well as other requirements.  Those developing the training would start with these requirements, not with what they felt like teaching!

I encourage you to read as much as you can about the concept of competency frameworks (start with this Wikipedia article), browse through the Army’s MOS listings (for ideas on how to structure yours), and do your own Internet searches.  To read more on how competency frameworks are critical to the success of your business, visit my blog for earlier posts (such as this one) on testing in a business environment and this one on Purpose-Objectives-Goals for business training.  Future posts are planned for how Moodle supports competency frameworks through grades, scales, and outcomes.


Thursday, December 30th, 2010

The Year in Review – Using eLearning and Moodle in a Small Business


The needs of a small business are different from those of a big business, and different still from those of a university.  Unlike accounting and human resources, eLearning functionality has not been part of small business applications for very long.  Consequently, service providers, advice, and options are much harder to come by.  It might even be difficult to envision how eLearning could work in your business.

These posts from 2010 offer some ideas on how to use eLearning in general, and Moodle specifically, in your small business.  They also provide some guidance on what to look for and what to avoid.

My picks for best small business advice:

Here’s hoping for a safe and happy 2011. Happy New Year!


Tuesday, December 14th, 2010

Demystifying Moodle Quiz Settings Part 3


In the first two parts of this series on Moodle quizzes, we covered appearance and strictness.  This post discusses how much and what type of feedback we can provide to the students, with each question and for the exam as a whole. 

Part 3: Feedback Settings 

Review Options 

  • If you want to provide your students with feedback - both your comments and the right answers - check the first column “Immediately”.  If they can attempt the quiz again, obviously, they can use this feedback to get a better grade.  But if you have just one attempt, this is a great way to provide feedback while the questions - and the concepts - are still fresh in their minds. 
  • If you don’t want anyone to know the right answers until the test is closed for good, check the items in the far right column.  The quiz must have a close date for this to occur.
  • If you never want anyone to know the answers, uncheck all of the items.

Overall Feedback 

  • Grade boundaries define the range of grades for which each feedback comment is shown.  The highest (100%) and lowest (0%) are the defaults.  You can break that range into as many smaller bands as you wish.
  • Feedback is the text that will appear to the student when the quiz is submitted (if you have this checked in Review Options), according to his grade; a simple sketch of how the boundaries work appears after this list.  You can be as serious as you like (Excellent!), or silly (You’re so bright I need sunglasses in your presence).  Don’t be afraid to customize this feedback to match your content, both in topic and tone.  A play on words is another form of reinforcement…
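
To make the grade boundary idea concrete, here is a minimal sketch (illustrative only, not Moodle’s actual code) of how a percentage score is matched to a feedback comment; the boundaries and messages are invented.

```python
# Boundaries are listed from highest to lowest.  A feedback string applies to
# scores at or above its boundary and below the next boundary up.
GRADE_BOUNDARIES = [
    (90, "Excellent!  You're so bright I need sunglasses in your presence."),
    (70, "Good work - review the questions you missed."),
    (0,  "Please revisit the material and try again."),
]

def overall_feedback(percent: float) -> str:
    """Return the overall feedback text for a score between 0 and 100."""
    for boundary, message in GRADE_BOUNDARIES:
        if percent >= boundary:
            return message
    return ""

print(overall_feedback(85))  # -> "Good work - review the questions you missed."
```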

The following settings are not part of the quiz settings page; they can be found in the question editing area.  What is displayed to the student is controlled by the Review Options settings.

Question Feedback 

  • General feedback can be left blank or include graphics, links, and formatted text, using the HTML editor.  This feedback is on the question as a whole, not dependent on the student’s response. Use it to provide more information on the topic (including links and graphics).
  • Most question types provide the option of feedback for each answer.  If you have designed your questions with plausible wrong answers, this is a great opportunity to provide additional explanation on why that answer is incorrect.  Don’t just say “sorry” or “wrong”.  There’s no value in that type of feedback. 

I encourage you to play around with these settings, previewing the quiz each time.  Be consistent in your settings for each type of test.  To reinforce concepts, be “lax”.  For final exams that really matter, be “strict”.

All you need now are some well-written questions!  For more on testing in a business environment, check out these posts:

Go to Part 1: Appearance settings

Go to Part 2: Strictness settings


Tuesday, December 14th, 2010

Demystifying Moodle Quiz Settings Part 2


In Part 1, we covered settings that control the appearance of the quiz.  In this post, we’ll discuss the settings that control how much information is provided to the student, and when.  These settings provide us with the opportunity to give “open book” vs. “closed book” exams, “proctor help”, and “instant grading”, all very much like we could do in person.  This gives the Moodle quiz activity tremendous versatility because it can be used as a formal certification exam, an informal pop quiz, or anything in between.

Part 2: Strictness Settings

Timing

  • If you want to force students to take a timed exam, enter the number of minutes in the time limit field.  A really cool countdown clock will appear when the exam is started.  For business training not regulated by professional licensing or other certification rules, you’ll probably want to leave this disabled. Unless you just love the clock…
  • If you allow only one attempt (discussed later), the time delay between attempts is irrelevant.  If you want to use this quiz to test the reliability of your test instrument, you’ll want to put an appropriate delay here.

Attempts

  • You can practically give away the answers while still allowing only one attempt, so don’t be misled into thinking that one attempt is the strictest setting.  If you want a measure of question reliability, you’ll need at least two attempts.  If you’re just giving an exam and don’t intend to measure the test itself, keep this at one.
  • “Each attempt builds on the last”, when checked, shows the student the answers he gave on the previous attempt.
  • Adaptive mode, when enabled, tells the student “no, that wasn’t the right answer”, so the student can keep trying until he gets it right.  This mode can also change the question, depending upon what the student submitted as an answer. 
    • In my experience, there is no need for this complexity (and often no one has the skill to do it) in business training.  Do not use this type of quiz unless it makes sense for your content, you can make good use of the information, and you have skilled test question developers to create it.
    • If you use adaptive mode, with no penalties and no change in the question wording, plus useful feedback on each question, you can use this quiz to reinforce concepts.  The grades won’t be of any value, but it can be a good teaching tool. 

Grades

  • With only one attempt, this is irrelevant.  The choices are fairly self-explanatory and I can’t think of any “typical” one to advise you to use for business training exams.
  • Applying penalties is meant to keep people from guessing.  If students leave a question blank, they get no credit; if they guess it wrong, they lose points.  I don’t like this choice, ever, because it makes analyzing grades much harder. If you have allowed adaptive mode (above), you must apply penalties to prevent everyone from getting 100%!  A simple sketch of how a penalty affects the mark appears after this list.
  • The precision of the grades is up to you; the usual rule is to report one more decimal place than exists in the original data (for example, whole-number scores reported to one decimal place).
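
As an illustration of how a penalty works against guessing, here is a minimal sketch assuming a simple scheme in which each wrong try subtracts a fixed fraction of the question’s mark (Moodle’s actual calculation depends on the version and the question’s penalty factor).

```python
def mark_after_penalties(max_mark: float, wrong_tries: int, penalty: float = 0.1) -> float:
    """Mark awarded once the student finally answers correctly.

    penalty is the fraction of max_mark deducted for each earlier wrong try.
    """
    return max(0.0, max_mark * (1.0 - penalty * wrong_tries))

# A 1-point question answered correctly on the third try (two wrong tries first):
print(mark_after_penalties(1.0, wrong_tries=2))  # -> 0.8
```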

You should now be able to create a Moodle quiz activity with the appearance and student difficulty level you desire. To review the basic appearance settings or to learn about feedback:

Go to Part 1: Appearance settings

Go to Part 3: Feedback settings


Tuesday, December 14th, 2010

Demystifying Moodle Quiz Settings Part 1


One of the beautiful things about the Moodle Quiz activity is that with a few clicks, you can create a “closed book, timed, seriously strict” exam (assuming your questions are good, too); with a few other clicks, you can produce a fun, silly, interactive memory jogger.  You can use the same questions in different quizzes with different “strictness” settings, having to create each question only once.  You can provide the right answers, with serious or funny feedback, or leave the students wondering if they passed or bombed.

I’ll split this discussion into three posts, according to what the settings control:

Part 1: How it appears to the students

Part 2: How “strict” it is on the students

Part 3: How much feedback is given to the students

What you choose for each setting depends on your overall training objectives and the purpose of each Moodle quiz you create.

Part 1: Appearance Settings

General 

  • The name you give it will appear in the course outline, so give it a meaningful name.
  • In the HTML editor you can create whatever you want your students to see.  I try to put a nicely formatted description in all quizzes, like this:  [click here for an example]

Timing 

  • If you have an ongoing, self-paced course, disable both the open and close dates in this section.  If your course has a start and end date, your quiz available dates should correspond to the timeline of your syllabus.

Display 

  • Everything I have read about this says “5” is the best number of questions per page.  This is to reduce the load on the server. 
  • Shuffling is good if you think someone has this up his sleeve: 1.a, 2.b, 3.e, 4.c, 5.f…  It’s also useful if you’re doing a study where you’re trying to randomize the effect of the question order.  For most business applications, shuffling of questions or answers is not necessary.

Common module settings 

  • The Group mode is the same as with all other Moodle activities.  If you don’t have groups set up in your course or if you want everyone to take the same quiz, regardless of group, leave this at no groups.
  • Visible is obvious.  If you want students to see it, you need to show it.
  • Grade categories are methods of aggregation (average, total, worst, highest) of the individual grades.  Frankly, I never use this.  I dump it all into Excel® and do simple calculations and graphs from there; if I want more serious analysis (which I often do), I export it to Minitab®.  A quick example of that kind of analysis is sketched after this list.
  • If you set the ID number to something, you’ll have that as an extra field in your data file. 
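
If you do export the grades, here is a minimal sketch of the kind of quick summary I have in mind, assuming the grade report was saved as a CSV file with one row per student; the file name and the “Grade” column name are assumptions about your particular export, not fixed Moodle names.

```python
import pandas as pd

# Assumed export: one row per student, with a numeric "Grade" column (0-100).
grades = pd.read_csv("quiz_grades_export.csv")

print(grades["Grade"].describe())        # count, mean, std, min, quartiles, max
print((grades["Grade"] >= 80).mean())    # fraction of students scoring 80 or better
```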

Security 

  • Browser security is an attempt to stop cheating, but as the help file indicates, it isn’t simple.  I never, ever check this.
  • I’ve never quite seen the need for a password in the quiz, since the user has to have logged in to take it. 
  • The last option in this section is used only if you want to restrict where your students can log in from when they take the quiz.  If you want them to be at their desks, not in their living rooms, you’ll want to enter your company IP addresses here.  This is especially useful if there might be classified or sensitive information in the quiz. 

At this point, you have enough information to set up a Moodle quiz, using the defaults on the other settings.  You will, of course, have to upload or enter questions. That is not covered in this post. 

Go to Part 2: Strictness settings

Go to Part 3: Feedback settings


Tuesday, August 3rd, 2010

Creating Purpose-Objectives-Goals for a Business Training Course


In my post on creating course outlines, I wrote that two pages in any course should be the Purpose-Objectives-Goals (POG) page and the Summary page. That sounds simple enough, right? Well, maybe not…

What is the POG for a business training course? Is it the same as it would be for a university course? Does it come from the business case for delivering the training? Is it related to the mission of the business?

Let’s start with some assumptions:

  • Business training differs from academic education (note the different uses of “training” and “education”).
  • Academic education seeks to impart not just information to students, but to equip them to think about new scenarios, to integrate ideas, and to build upon their education as they experience life. This is done through a foundation of knowledge. We don’t simply learn that “2+2=4”, but why it’s so.
  • Business training, while sometimes offered solely for the enrichment of employees, is usually targeted at improving a business metric. Or, at least, it is intended to be.

A while back, I wrote a post on assessing the effectiveness of business training. I have observed a huge gap between the intended or desired outcome of business training and what it actually delivers. My “hypothesis” (which I have empirical evidence to support) is that this gap exists because of three things:

  • The lack of proper evaluation of training effectiveness
  • The failure to align training objectives with business objectives
  • The failure to create and deliver training to those objectives, even when they have been aligned in the first place

The first item is discussed in the post; the third item is a deeper subject known as instructional design. This post addresses the second item, aligning training objectives with those of the business.

So, how do you align training goals and objectives with the goals and objectives of your business?

  1. First, understand your business goals and objectives. Where are your “problem areas”? What do you want to improve? Where do you want to reduce risk? Some likely business examples:
    • prevent accidents
    • reduce errors
    • improve customer service
    • improve efficiency
    • reduce waste
    • improve communication
    • reduce time to market
    • leverage knowledge
    • protect intellectual property
    • improve work environment (physical)
    • improve quality of work (emotional)
    • increase promotion opportunities
    • increase market share
    • reduce redundancy/confusion from department to department
  2. Second, understand, at least to some degree, who in the organization can affect these goals. Your delivery driver might have a strong influence on several goals, but she isn’t going to have anything to do with reducing the time to market of a new product. A RACI chart would be a useful tool for this.
  3. Based on the RACI chart, decide what level of training should be provided to each position for topics aimed at achieving each goal. Bloom’s rose would be a great reference for this; one way to record the result is sketched after this list.
  4. Determine what those topics, tools, and methods are. You will need to seek the assistance of subject matter experts to accomplish this.
  5. Create a curriculum (map out all of the training).
  6. Write a POG for each course in that curriculum. Please note that the terminology can be highly variable. I’ve seen many instances where Goals were defined as more general than Objectives. Still others use them interchangeably or use completely different terminology. It doesn’t matter. The important things are that you use the terminology consistently, in a manner that your students understand, and that these three words combine to define the scope of the course.
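
Here is a minimal sketch of one way to record the output of steps 2 and 3: a simple table mapping each position and business goal to a target depth of training (loosely named after Bloom’s levels). The positions, goals, and levels are invented for illustration.

```python
from typing import Optional

# Target depth of training for each (position, business goal) pair,
# derived from the RACI chart.  Pairs not listed get no course on that goal.
training_plan = {
    ("Delivery Driver", "prevent accidents"):        "Apply",
    ("Delivery Driver", "improve customer service"): "Understand",
    ("Design Engineer", "reduce time to market"):    "Analyze",
    ("Design Engineer", "prevent accidents"):        "Remember",
}

def target_level(position: str, goal: str) -> Optional[str]:
    """Return the planned training depth, or None if no course is planned."""
    return training_plan.get((position, goal))

print(target_level("Delivery Driver", "reduce time to market"))  # -> None
```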

Purpose (a.k.a. Aim): These statements should be formulated with phrases similar to these: “to provide an overview of…”, “to provide the framework for…”, “an in-depth discussion of…”, “to advance the knowledge from Course 101”, “to apply knowledge to field examples in…”.

While Control Charts have a solid history of use in manufacturing, they are excellent tools for use in monitoring and controlling transactional processes as well. This course demonstrates the construction and use of control charts, providing both scenarios and corresponding example control charts.

(Learning) Objectives: These are drawn largely from Bloom’s categories (cognitive domain) and are more specific than the purpose of the course. There are usually a few objectives.

1. Explain the purpose and proper use of control charts.
2. Introduce the six basic types of control charts.
3. Provide examples of how control charts can help spot trends and identify potential problems in the processes.

Goals (a.k.a. Learning outcomes): SMART goals directly related to the objectives.

It is important that you leave this course knowing:
1. Which type of control chart is best suited to different situations.
2. How to construct and use a control chart.
3. How control charts fit into larger quality initiatives.

These examples are taken from the SPC 101 course at BeeLearn.com. They are not perfect. Yours probably won’t be, either. But, they are “good enough” to define the scope of the course, set expectations, and to build content around.

The Summary page of every course should tie back to the POG. The course exams and activities should be built in support of the POG. The content should be built to the POG. If you do this, you’ll have created a course that serves a purpose: to make your business stronger by providing training that is aligned with, and effective at meeting, your goals and objectives.


Tuesday, April 13th, 2010

Client Spotlight: Charity Uses Moodle to Reach More People


From the April 2010 issue of Penny For Your Thoughts newsletter…

Small businesses are not the only entities that can benefit from eLearning.  In this post, our spotlight is on how Moodle has enabled a charitable organization to continue to offer its grief counseling workshop, even in tough economic times.  The Association for Pet Loss and Bereavement (APLB) was founded in 1997 by Dr. Wallace Sife, author of The Loss of a Pet.  For many years, he had delivered his 10-hour workshop, Pet Loss Counselor Training, in the New York City area.  As travel became more difficult and budgets were cut, Dr. Sife began to search for an online alternative. Reliant on volunteers and donations, APLB could not afford an expensive IT solution.

In 2009 he selected Moodle as his delivery platform; Albany Analytical created the course and managed enrollments.  The first two sessions (summer and winter) have seen participants from more than a dozen states/provinces and three continents; such diversity would likely not have been possible without the online solution. 

Relying more on solid content than whiz-bang bells and whistles, the APLB course successfully graduated 33 in its first year.  With Dr. Sife’s commitment to excellence, each participant’s competency was assessed on an individual and thorough level.  Because of the nature of the material, all testing was accomplished through submitted essays, designed to measure understanding and ability to perform pet loss counseling.  The Moodle features of a participant forum, live links to relevant sites (such as the APLB newsletter archive), downloadable documents, and a calendar to remind participants of impending due dates, all supported the learning experience.

Dr. Sife plans to conduct these online workshops twice a year, as well as his traditional face-to-face session every other year (the next one is May 21). Many more people were able to participate in this workshop this past year than would have been possible without the online version.  Moodle eLearning has allowed the APLB to continue to offer this important training even though a tough global economy prevented travel by most potential participants.  We look forward to the continued success of this program.


Monday, April 5th, 2010

Assessing the Effectiveness of Business Training


Whether you are the department head paying for it, or the consultant delivering it, justifying the expense of training is more important now than ever.  While it might not be easy to quantify the long term financial benefits of a trained workforce, it is possible to demonstrate knowledge, skill, and practical application gained from a specific training workshop. 

While there is no SAT or other standardized test to administer in most business training, we’re so programmed to take (and give) tests that every business training course seems to end in one.  Typically, everyone passes with flying colors.  Does this mean the training was effective?  No.  It doesn’t even mean that any knowledge was transferred.  What we really want to know is whether the training changed the way people do their jobs, which is the whole point of business training!  This is rarely measured or tracked, but it can (and should) be.

During the workshop, test knowledge and the ability to apply it, in a variety of ways: 

  • Administer well-written tests, not just tests.  These tests should be developed by people who know how to write valid tests; being a great trainer does not always lead to being a good test writer.  Knowing the subject upside down and sideways is no guarantee of test expertise.
  • Scatter quick quizzes (written or verbal) throughout the workshop.
  • Conduct group activities with a skilled observer who can assess whatever qualities you’re interested in: leadership, ability to work together, problem solving, time management, etc.
  • Include written assignments, to be graded for content as well as writing ability, if writing is an important skill in using the workshop knowledge.  If writing isn’t important, don’t make the students write!
  • Include a speaking assignment (such as a project presentation) if it’s important to be able to use this new knowledge.  If it is important, grade it.  If it’s not important, leave it out!

The most valuable assessment comes after the workshop.  This is where the knowledge is applied to the job, which is where the benefits lie. 

  • Coaching (usually by the trainer) to guide the participants through the first-time application of their new-found knowledge is not only a valuable assessment tool, but it generally improves the effectiveness of the training.
  • Outside-of-class assignments – graded by the trainer/coach – are a great way to assess the ability of the participant to apply what was taught in the workshop.
  • Follow-up surveys (to obtain self-perception) in conjunction with project/work audits can be used to measure actual implementation of the workshop knowledge.

Stay tuned for future posts to include examples of good tests and surveys and how to improve the effectiveness of training (once you’ve measured it, you can make it better!).


Thursday, March 25th, 2010

eLearning Tests and Surveys – What Tools Do You Need?


Sometimes, you just get lucky.  I could not figure out how to write this post in fewer than 5000 words, many of which would be red, bold, and in uppercase letters.  Then I came across this post by Connie Malamed at  The eLearning Coach.  So, now I don’t have to write about how to write a good question!  Everyone should heed her advice before even considering what tool to build the questions in.  All the beautiful technology in the world won’t matter if the questions suck.  And, unlike an ugly graphic or a boring video, a bad test question can harm the learner.  A poor test that is used to judge an employee’s job-worthiness is never going to be a good thing.

So, I’ll assume you have some great assessment questions – for quizzes, tests, and surveys – that are reliable and valid and that you want to use them in your elearning courses.  Let’s look at them in terms of the Five Basic Things to consider about authoring tools:

Will assessments add value to my training design? Yes. Every eLearning course needs some assessment method.  At a minimum, you’ll want to know if your students perceived that they got something out of it.  That’s a feedback survey, not a test, but the question development and delivery are the same.  You’ll also probably want to know whether the students learned anything and/or can demonstrate some level of competency.  This tells you not only what they learned, but also how good your course was at teaching them.

Do I have the skill? This is a two-parter.  The skill to write questions is one thing.  It’s the main thing. The skill to build questions in an online application is another.  Luckily, the latter is remarkably simple because of some of the fantastic options available. 

 What are the options?  I build all of my tests in Moodle so I don’t use many of the available tools.  But, I also use Engage as a “pop-quiz” maker. If I don’t care what the answers are and I just want to reinforce a concept, I think this is a better option than a “flat test”.  Moodle has a Choice activity that captures the data but doesn’t look as snazzy.  Articulate also offers QuizMaker, which is about twice the price of Engage and does quite a bit more.  Joomla QuizForce is similar to QuizMaker and comes with the LMS.  At a hefty $800, Adobe Captivate does many things, including flash-based quizzes. If you don’t have a good assessment feature in your LMS or you want something customized that is in HTML and not flash, try Drupal Webform.  I have an example here.  Or, you can code one from scratch! (Yeah, right). Be creative.  You can use a quiz as a survey and vice versa. 

How much functionality do you need from this tool? The answer to this depends on what your objective is.  If you want to wake up your students with a flash presentation in full color and sound, you’ll want to opt for one of the flash applications.  If you want some serious data collection that dumps to a file that you can do statistical analysis on, pay more attention to the results than the delivery methods.  That usually means not using flash.  If you need to prove competency or that the student took the test herself, you’ll need to make sure that you can verify log in and track time spent, as well as grades.

Will this tool work within my LMS? To be assured that it will work, use the one that is built into your LMS (Moodle, Joomla, Blackboard, etc.)  That will ensure that you can track all the vital student statistics.  Check to make sure that your LMS will accept flash applications and/or allow you to open external websites before using those tools.

I believe that all instructors should want feedback on their courses and should use tests to judge themselves as well as their students, in a constant effort to improve.  But if you own a business that offers eLearning to clients or employees, there is also a financial reason to have good test and survey questions.  Knowing what level of competency your employees have, which training works and which doesn’t, and having the ability to get feedback on the experience is critical to long-term success.  Remember, what matters most is that you start with good questions.  Good information, good questions, lots of variety…GREAT eLearning.


Saturday, January 30th, 2010

Clarification is just around the corner!


Hang in there…by the end of this very short month (February), I’ll be posting all kinds of useful information, in just plain language, to help you answer all those burning questions about eLearning for your business.

In the meantime, please visit AlbanyAnalytical.com  to find some answers and read my eBook, Moodle e-Learning: Questions and Simple Answers about Online Training.

Penny Mondani, Moodle Maven

