












    TIP•R FAQs

    Below are questions frequently raised by schools and divisions as they plan their implementation of the JMU NETS•T Certification Program. These questions are equally relevant to the implementation of the JMU TIP•R Program.


    Should the Evaluator and the person being evaluated be acquainted with one another?

    While there may be some advantages (e.g., objectivity) when the Evaluator and the person being evaluated do not know one another, there is more to be gained when the two parties have a professional relationship that extends beyond the NETS•T evaluation context. This is particularly true with regard to professional development. While the JMU NETS•T certification program has a strong assessment component, it is in equal measure a program to promote professional development, and that development may be best promoted when the Evaluator is responsible for the professional development of the person being evaluated, as is the case with a division Technology Coordinator and an ITRT, or with an ITRT and a teacher in a school served by that ITRT.


    Should Evaluators be skilled in the evaluation of all rubrics, or should they specialize in only a subset of rubrics, particularly in divisions that have multiple Evaluators?

    Specialization is an option only for Evaluators working in divisions that have multiple Evaluators. Even in those cases, however, it is preferable that every Evaluator have a thorough understanding of all rubrics and requirements, since there is some overlap in the skills the rubrics address and, conversely, a single piece of evidence may be suitable for multiple rubrics.


    When a submission is rated at less than Meets, should subsequent ratings be done by the original Evaluator or a new Evaluator?

    The evaluation process is an ongoing dialogue that requires knowledge of previous submissions and exchanges (e.g., feedback provided by the Evaluator) between both parties. As such, switching Evaluators during the review of a given rubric is not recommended. However, different Evaluators may evaluate different rubrics for the same educator, subject to local preferences.


    For divisions that have more than one Evaluator, how should evaluations be assigned?

    When a division has multiple Evaluators, the Evaluators may decide among themselves which submissions each will take. All submissions for a given division are listed on that division's web page, so Evaluators can view the evidence and claim submissions based on whatever criteria they agree on internally.


    Should Evaluators be paid for their services?

    In the SVTC, Evaluators had funds available to pay for their services but voted in every year of the grant not to allocate funds for this purpose, preferring instead to allocate those funds to direct teacher support (e.g., hardware incentives). The Evaluator role is certainly time-consuming, but the Evaluators felt it was their responsibility to help provide professional development for their teachers and, as such, viewed NETS•T evaluation as an extension of their regular job responsibilities. In any event, individual VCOPs are free to make whatever decision they wish in this regard.


    Should edits to submissions be allowed?

    The SVTC concluded that allowing edits to submissions would create the potential for confusion, so once evidence is submitted it becomes an official record of the teacher's portfolio and cannot be changed. When educators want to edit or otherwise revise a previous submission, they simply submit the revised document or artifact.


    Should issues of spelling, typos, or grammar detract from the submission?

    Teachers' artifacts and reflections are considered professional documents, comparable to any other official documents. As such, the evidence should be free of errors that would detract from its professionalism.


    How should submissions (e.g., work samples) that include personally identifiable information regarding students be handled?

    The SVTC felt very strongly that any evidence containing personally identifiable student information should be rejected automatically and not reviewed, and a reject button was built into the NETS•T system for this purpose. However, some school divisions may have policies that differ on this subject, so be sure to check the policies in effect in your division.


    What file types will be supported for submissions?

    In order to review artifacts, Evaluators must have software available that supports the formats in which the artifacts were created. Accepted formats are therefore the more widespread ones, including .doc (Word) and .pdf (Adobe Acrobat Reader) for documents and .jpg for images. More obscure formats, such as AppleWorks, are not allowed, although Evaluators may exercise some discretion in the file formats they accept.
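
    For illustration only, here is a minimal sketch (in Python) of how such a format allowlist might be checked. The extensions come from the answer above, but the function, the constant, and the check itself are assumptions for illustration and are not part of the actual JMU submission system.

        # Hypothetical allowlist check -- illustrative only, not the JMU system.
        from pathlib import Path

        ALLOWED_EXTENSIONS = {".doc", ".pdf", ".jpg"}  # formats named above

        def is_accepted_format(filename: str) -> bool:
            """Return True if the file's extension is on the allowlist."""
            return Path(filename).suffix.lower() in ALLOWED_EXTENSIONS

        # Example usage:
        #   is_accepted_format("lesson_plan.pdf")  -> True
        #   is_accepted_format("portfolio.cwk")    -> False (e.g., AppleWorks)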


    Can teachers and ITRTs work together in developing their evidence?

    Teachers and ITRTs are encouraged to provide mutual support and guidance to others engaged in the NETS•T process. However, the actual effort involved in creating the artifacts and reflections must be done by the person submitting the evidence. It is permissible in some instances to submit evidence created by others, but the material must be properly attributed and must support original work completed by the submitter. For example, a teacher may submit an exemplary lesson created by someone else but must provide evidence of actual implementation of the lesson (e.g., student work samples) in his or her own classroom.


    How can we be sure that the evidence submitted by a teacher is the result of work performed by that teacher?

    A professional honor code is assumed to be in effect for all educators engaged in the JMU NETS•T process: teachers and ITRTs as well as Evaluators. Further, random spot checks of submissions and evaluations are conducted on an ongoing basis to detect any inconsistencies in procedures. Finally, the system automatically performs pattern recognition to flag anomalies in the evidence submitted or in the evaluations conducted.
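
    For illustration only, here is a minimal Python sketch of one kind of anomaly check such a system might run: flagging pairs of near-duplicate submission texts for human review. The actual pattern-recognition checks used by the JMU system are not described here, and every name below is hypothetical.

        # Hypothetical near-duplicate check -- illustrative only, not the JMU system.
        from difflib import SequenceMatcher

        def similarity(text_a: str, text_b: str) -> float:
            """Return a similarity ratio between 0.0 and 1.0 for two texts."""
            return SequenceMatcher(None, text_a, text_b).ratio()

        def flag_for_review(text_a: str, text_b: str, threshold: float = 0.9) -> bool:
            """Flag a pair of submissions for human review if they are highly similar."""
            return similarity(text_a, text_b) >= threshold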


    What kind of help is available to Evaluators who are unsure of the most appropriate rating for a submission?

    Discussing evaluation questions with other Evaluators is a very valuable way to learn the finer points of the process. The primary means of support is the other Evaluators who serve in a multi-Evaluator VCOP. Another useful option, particularly for an Evaluator who is the sole Evaluator for a VCOP, is the Second Opinion feature built into the Evaluator system, which allows an Evaluator to email the evidence in question to other Evaluators, who can then provide feedback based on their own experience and insight.

     

    "[JMU] NETS•T certification has been the most valuable professional development experience of my career.

    Kelly Lineweaver, Manager Shenandoah Valley Technology Consortium




