Effective learning environments do not happen by accident. They are the result of deliberate decisions about which tools to use, how to assess whether those tools are working, how to manage the infrastructure that keeps them running, how to ensure they operate ethically, and how to design them so that every learner, regardless of background, experience, or ability, has a genuine opportunity to succeed. Standard 3 asks candidates to demonstrate competency across all those dimensions, and it is the standard that most closely mirrors the day-to-day work of an instructional designer in a real professional setting.
The five artifacts I selected for this standard reflect different moments in my development of that thinking. My Virtual Reality review of CoSpaces Edu demonstrates my ability to use research and evaluation frameworks to make informed technology selection decisions, asking not just 'does this tool exist?' but 'does it actually serve the learner, and under what conditions?' My usability testing report on Memrise extends that evaluative thinking into a formal assessment, showing how structured data collection can improve the design of a learning environment from the inside out. My communication plan for the Channel Partner Training project addresses the often-overlooked management side of learning environments: the coordination infrastructure that keeps complex instructional initiatives on track. My research ethics document from EDF 6481 demonstrates that I understand ethical responsibility as a foundational design constraint rather than an afterthought. And my ARCS Motivational Design Workbook shows how I approach diversity not as a compliance checkbox but as a genuine analytical lens that shapes every instructional decision, from audience profiling through final design.
Together, these artifacts reflect my belief that a learning environment is only as strong as the thinking behind it, and that thinking must account for technology, evidence, management, ethics, and the full diversity of the humans it is meant to serve.
Candidates make professionally sound decisions in selecting appropriate processes and resources to provide optimal conditions for learning based on principles, theories, and effective practices.
Course: EME 6055 Technology in Education
Date: Fall 2024
Artifact: Virtual Reality (VR) Review
Role: Sole Creator
Project Type: Assignment
This artifact is a structured evaluation of CoSpaces Edu, a web-based virtual reality platform designed for educational use, completed as a weekly activity in EME 6055. The review evaluated the platform across eight dimensions: availability of pre-existing content, ability to create new content, safety and privacy, instructor training and support, alignment with learning and motivational theories, comparison to computer-based learning environments (CBLEs), appropriate use with students, and a sample lesson plan.
CoSpaces Edu was assessed as a platform supporting constructivist and experiential learning theories, with VR immersion encouraging curiosity through elements consistent with self-determination theory and game-based learning principles, including immediate feedback and reward systems. The review included a three-day middle school English Language Arts lesson plan in which students used CoSpaces Edu to design and present 3D storytelling projects. Beyond the K-12 context, the review extended its analysis to a corporate setting, proposing how VR could be applied in cybersecurity training to simulate real-world threat scenarios, such as identifying vulnerabilities in a virtual network or practicing incident response protocols, in a safe, risk-free environment with immediate feedback. Privacy and safety considerations were addressed, noting that CoSpaces Edu is both COPPA- and FERPA-compliant, with strong access controls for verified users.
3. Assess Target Audience Characteristics
Distinguished between the needs and learning contexts of K-12 students and corporate sales and security professionals, recognizing that the same technology requires fundamentally different design approaches depending on the audience.
4. Assess Relevant Characteristics of the Setting
Evaluated the platform's fit across two distinct learning settings (a K-12 classroom and a corporate cybersecurity environment), identifying how contextual factors such as device availability, instructor training capacity, and learner population shape whether VR is appropriate.
7. Select Instructional Media
Applied a structured evaluative framework to assess CoSpaces Edu's instructional utility across multiple dimensions (content, safety, theory alignment, and comparison to CBLEs), demonstrating the ability to make evidence-based media selection decisions rather than defaulting to novelty.
8. Recommend Instructional Strategies
Designed a three-day VR-integrated lesson plan for middle school ELA and proposed a corporate VR simulation framework for cybersecurity training, demonstrating the ability to translate technology evaluation into concrete instructional recommendations grounded in learning theory.
Candidates use multiple assessment strategies to collect data for informing decisions to improve instructional practice, learner outcomes, and the learning environment.
Course: EME 6208 Interactive Media
Date: Spring 2024
Artifact: Usability Testing Report
Role: Sole Creator
Project Type: Final Project
This artifact is a formal usability testing report evaluating the Memrise mobile language-learning application with three participants representing diverse professional backgrounds, ages, and prior experience with language-learning tools. Sessions ran 25 to 35 minutes each and were conducted both in person and via Microsoft Teams. Participants completed seven structured tasks covering core app functions: account creation, language selection, language switching, audio settings, preference changes, profile photo update, and password reset. The administrator collected quantitative data across four dimensions (task completion rates, time on task, clicks per task, and error counts), then followed up with a 5-point task difficulty questionnaire and a 19-item, 7-point satisfaction survey covering ease of use, efficiency, learnability, clarity, and overall experience.
Results revealed that while six of the seven tasks achieved 100% completion, Task 3 (changing the language) produced the highest error rate (12.3 average errors), the longest completion time (151.6 seconds average), and the most clicks (15.3 average), indicating a significant navigation design failure. Task 7 (password reset) achieved only 33% completion due to a Single Sign-On conflict that prevented two participants from resetting their passwords. The report concluded with two high-severity design recommendations: adding language switching to the settings menu and adding SSO-aware password reset guidance, each justified by the empirical data collected during testing.
1. Perform a Needs Assessment
Identified specific, data-supported usability gaps in the Memrise environment, particularly the language navigation and SSO password reset failures, through systematic observation and participant data, mirroring the discrepancy-identification process of a formal needs assessment.
3. Assess Target Audience Characteristics
Recruited three participants with deliberately varied profiles (age, occupation, prior app experience, language background) to capture a representative range of user perspectives and ensure findings were not skewed by a homogeneous sample.
9. Develop Performance Measurement Instruments
Designed and administered both a task difficulty questionnaire (5-point scale) and a satisfaction survey (7-point, 19-item scale), demonstrating the ability to construct valid measurement instruments tailored to the specific evaluation goals.
12. Evaluate Instruction, Program, and Process
Conducted a complete formative evaluation cycle: designed the study, recruited and observed participants, collected and analyzed both quantitative and qualitative data, identified high-severity issues, and generated prioritized, evidence-based design recommendations.
Candidates establish mechanisms for maintaining the technology infrastructure to improve learning and performance.
Course: EME 6235 Technology Project Management
Date: Fall 2025
Artifact: Communication Plan
Role: Sole Creator
Project Type: Assignment
This artifact is the formal communication plan developed as part of the Channel Partner Technical Training and Certification Program project at Veracode, completed in the Project Management course. The communication plan defined the governance structure for all project communications across the initiative, establishing how information would flow among the core project team, stakeholders, subject-matter experts (SMEs), leadership, and channel partners throughout the training development lifecycle.
The plan specified three communication goals: aligning all go-to-market (GTM) teams on project milestones and blockers, ensuring transparency and timely updates for stakeholders, and facilitating consistent feedback loops with SMEs and leadership. It established a tiered communication cadence: weekly standups and SME syncs, biweekly stakeholder status reports, and monthly leadership KPI reviews and milestone summaries. Four technology tools were designated for specific communication functions: Slack for daily updates and sprint coordination, Monday.com for task tracking and dependency management, Zoom for synchronous meetings, and Highspot for content sharing and certification updates. The plan also identified and addressed anticipated challenges, including SME availability constraints, varying communication preferences across teams, and cross-time-zone coordination with LATAM partners.
2. Plan and Monitor Training Projects
Developed a comprehensive communication governance structure (meeting cadences, reporting formats, tool assignments, and challenge mitigation strategies) that directly supported the monitoring and coordination of a multi-stakeholder training project.
4. Assess Relevant Characteristics of the Setting
Identified and addressed real contextual challenges of the project environment: SME availability constraints, cross-functional team communication preferences, and international time zone coordination with LATAM partners, integrating those realities into the plan's design.
7. Select Instructional Media
Made deliberate technology tool assignments (Slack for async coordination, Monday.com for task management, Zoom for synchronous reviews, Highspot for content distribution), matching each tool to a specific communication function rather than defaulting to a single platform.
10. Develop Training Program Materials
Produced the communication plan as a formal project governance deliverable, demonstrating the ability to create structured documentation that supports and sustains training program implementation, not just content design.
Candidates foster a learning environment in which ethics guide practice that promotes health, safety, best practice, and respect for copyright, Fair Use, and appropriate open access to resources.
Course: EDF 6481 Foundations in Educational Research
Date: Spring 2026
Artifact: Research Ethics Document
Role: Sole Creator
Project Type: Assignment
This artifact is a graduate-level research assignment that addresses the ethical design of a proposed study examining AI-powered role-play simulation as a corporate sales training tool. Task 3 focused specifically on participant sampling and the ethical obligations involved in conducting research within a workplace learning environment. The proposed study targeted sales representatives at a cybersecurity company who would use an AI roleplay bot as part of objection-handling training, with an anticipated sample of 30 to 50 participants obtained through convenience sampling aligned with natural training cohorts.
The document provided a thorough analysis of the ethical considerations inherent in this research context: voluntary participation, informed consent, data confidentiality, de-identification of AI-generated performance scores and roleplay transcripts, and the power dynamics that arise when an employer-affiliated study is conducted with employees. It explicitly addressed the need to separate research participation from job performance expectations to prevent coercive participation. It also engaged substantively with diversity and equity considerations, noting that participants from varied cultural backgrounds, first languages, and experience levels would require clear, jargon-free survey instruments and that diversity should be treated as a meaningful contextual variable rather than a methodological inconvenience.
3. Assess Target Audience Characteristics
Identified and described participant characteristics — experience level, cultural background, linguistic diversity, and organizational position — and directly connected those characteristics to specific ethical design decisions about consent, instrument clarity, and data handling.
9. Develop Performance Measurement Instruments
Addressed the ethical requirements for designing data collection instruments in a workplace setting, including ensuring survey clarity for non-native English speakers, protecting participant responses from managerial visibility, and piloting instruments before full implementation.
12. Evaluate Instruction, Program, and Process
Designed an ethical evaluation framework for assessing an AI-supported training intervention that treats participant safety, confidentiality, and equity as non-negotiable constraints on the research and evaluation process, not trade-offs to be managed.
Candidates foster a learning community that empowers learners with diverse backgrounds, characteristics, and abilities.
Course: EME 6491 Motivational Design
Date: Summer 2025
Artifact: ARCS Motivational Design Workbook
Role: Sole Creator
Project Type: Assignment
This artifact is the completed seven-worksheet ARCS Motivational Design Workbook developed for EME 6491, applied to a real corporate sales enablement course, 'From Risk to Resolution: Mastering the Veracode Value Pitch', designed for Veracode's sales team. While this workbook also appears in the Standard 2 section, it is included here under S.3.6 because its audience analysis and design approach specifically and systematically address learner diversity as a central design variable rather than a secondary consideration.
Worksheets 2 and 3 of the workbook constructed a detailed learner profile that documented meaningful differences across the participant population: variation in sales experience levels (from brand-new hires to experienced representatives), differing motivational attitudes (performance-driven optimism versus skepticism from those who had sat through ineffective training before), variation in prior exposure to the company's messaging frameworks, regional and team differences affecting peer familiarity, and cross-cultural factors relevant to a global sales organization. The audience analysis identified two distinct motivational profiles, one group with low attention readiness and one with high intrinsic motivation, and designed ARCS tactics specifically calibrated to reach both. The final design plan (Worksheet 7) addressed this diversity through scaffolded practice, retry-without-penalty simulation environments, multiple pathways through content, and competitive and non-competitive engagement options that recognized different learner preferences.
1. Perform a Needs Assessment
The workbook's systematic analysis of motivational gaps, identifying that confidence and attention were the primary challenges for the learner population, constitutes a formal motivational needs assessment, with findings directly shaping the instructional intervention.
3. Assess Target Audience Characteristics
Worksheets 2 and 3 produced a comprehensive learner profile documenting variation in experience, motivation, attitude, peer familiarity, regional context, and learning preferences, directly operationalizing this competency for a real and diverse corporate audience.
4. Assess Relevant Characteristics of the Setting
Considered the LMS delivery environment (Highspot) and how its features (self-paced navigation, AI simulation, leaderboards, and branching scenarios) could be configured to accommodate diverse learner preferences.
8. Recommend Instructional Strategies
Translated the diversity findings from the audience analysis into differentiated ARCS tactics: scaffolded practice for low-confidence learners, competitive leaderboards for high-motivation learners, retry-without-penalty simulations, and multiple engagement pathways, all calibrated to specific learner profiles identified in the analysis.