Instructional design models emerged in the 1960s and have since proliferated across instructional technology journals and the broader educational literature (Branch & Dousay, 2015). These models function as conceptual frameworks and organizational guides, enabling instructional designers (IDs) to analyze, construct, and assess structured learning systems. They support clarity in planning, coordination, and evaluation, whether applied to expansive educational settings or to specific training tasks.
Why use models? They allow us to conceptualize complex realities through simplified representations. A model reduces intricate processes, structures, or abstract concepts into more manageable forms. Since reality often presents overwhelming variability and situation-specific elements, models focus on what remains consistent across settings. Seel (1997) classifies instructional design models into three categories: theoretical or conceptual, organizational, and planning-prognosis. The models discussed here fall under organizational models, designed to function as broad, adaptable guides for instructional planning (Branch & Dousay, 2015).
INSTRUCTIONAL DESIGN MODELS BY PHASE
| ID Models | Analysis | Design | Development | Implementation | Evaluation |
|---|---|---|---|---|---|
| Gerlach & Ely (1980) | Assess entry behavior; Identify content & objectives | Determine strategy, organize groups, allocate time and space, select resources | Prepare instructional materials | Implement strategies and learning groups | Evaluate performance; analyze feedback; revise |
| ASSURE | Analyze learners, traits, learning styles, competencies | Write objectives; plan strategy, media, and activities | Select, adapt, or create materials | Utilize materials and engage learners | Evaluate outcomes and revise instruction |
| Morrison, Ross, Kalman, Kemp | Identify learner characteristics and instructional problems | Set objectives and sequence content | Develop materials and project components | Deliver structured instruction | Evaluate and revise using feedback |
| PIE Model | Gather data on learning needs and media use | Create lesson plan with technology | Build instructional tools and content | Execute instruction and media strategy | Evaluate learning and revise |
| 4C/ID | Analyze learning tasks and performance goals | Design whole tasks and supportive information | Develop learning materials and simulations | Deliver tasks progressively with scaffolds | Implicit through task performance |
| Wiggins & McTighe (UbD) | Identify desired results | Plan assessments | Sequence learning experiences | Deliver flexible units | Evaluate alignment and outcomes |
| Agile Model | Task and instructional analysis | Design strategy and system flow | Develop learning materials | Implement and train | Continuous evaluation |
| IPISD | Analyze roles, tasks, and gaps | Structure outcomes and course flow | Develop instructional packages | Train instructors and deliver content | Internal and external evaluations |
| Gentry | Prioritize goals using needs analysis | Define objectives, strategies, and media | Develop prototypes and final content | Manage delivery systems | Gather data and improve instruction |
| Dorsey, Goodrum, and Schwen | Establish user needs and shared vision | Create conceptual prototypes with user input | Iterate prototypes through rapid cycles | Implement stabilized solutions gradually | Evaluate through continuous usability testing |
| Diamond | Assess needs and institutional priorities | Plan formats and content development | Produce and field-test materials | Coordinate and implement instruction | Evaluate and revise for effectiveness |
| Smith and Ragan | Analyze learners, context, and tasks | Generate instructional strategies | Produce instruction | Prepare guides and materials | Formative and summative evaluation |
| Dick, Carey, and Carey | Assess needs and analyze learners | Write objectives and assessment tools | Develop aligned instructional materials | Deliver refined instruction | Iterative formative and summative evaluation |
| Merrill (Pebble in the Pond) | Identify skills and authentic tasks | Design task-centered strategies | Develop expanding instructional layers | Apply learning in real-world contexts | Gather feedback and revise continuously |
Table 1: ID models by phase
Classroom-oriented instructional design models align closely with day-to-day teaching. Educators work within fixed schedules, limited class time, and defined responsibilities, and this reality shapes how they plan instruction. These models also appear in corporate training programs that operate under similar constraints. Because time and resources remain limited, teachers often revise existing materials instead of creating new ones, and because many subjects run only once a year, the need for repeated evaluation or constant updates is reduced. Classroom models offer broad guidance that helps instructors organize lessons and manage delivery. Although they lack formal recognition in many settings, several remain practical and accessible. Developers who support classroom instruction must account for this reality, especially since some educators view instructional design as overly procedural. Models such as Gerlach and Ely, Heinich et al. (ASSURE), Newby et al. (PIE), and Morrison et al. continue to function well in typical classroom conditions.
Product-oriented instructional design models center on creating standalone materials for learners who work independently. These models assume the development of original content, repeated testing, and attention to usability. Learners often rely on facilitators or systems instead of instructors. Some rapid prototyping methods allow early feedback, though interaction often stays limited. These models fit corporate training and technology rollouts where requirements already exist. As online and distance learning expand, demand grows for scalable and well-designed instructional products. Examples include models proposed by Bergman and Moore, De Hoog and colleagues, Bates, Nieveen, and Seels and Glasgow.
Classroom-, product-, and system-oriented models support learning at different levels. Each serves a specific scale and purpose. Classroom models fit educators working within time limits and resource constraints. Product models support independent learning and emphasize usability and refinement. System models address broader instructional efforts and require coordinated teams and institutional alignment. All three depend on deliberate design thinking and careful planning.
Instructional design models offer different ways to plan and deliver instruction. Each reflects assumptions about context, scale, and learner needs. Some favor linear processes and detailed evaluation. Others support iteration, collaboration, or task-focused outcomes. These models serve classrooms, higher education, corporate training, and digital learning environments. Their effectiveness depends on how designers apply them, the resources available, and the demands of instruction.
The following table compares selected instructional design models, summarizing where they work best, what they offer, and the limitations to anticipate.
INSTRUCTIONAL DESIGN MODELS COMPARISON
| ID Model | Strengths | Weaknesses | When to Apply |
|---|---|---|---|
| ADDIE | Simple, flexible, widely recognized; clear phase structure | Linear model; may not reflect real-time iteration | Ideal for structured training development and beginner instructional designers |
| Dick, Carey, and Carey | Systematic and detailed; includes strong evaluation focus | Time-consuming; can be rigid for smaller or agile projects | Best for complex instructional development in business, military, and education sectors |
| Smith and Ragan | Emphasizes instructional strategies; rooted in cognitive theory | May appear highly linear; implementation guidance is brief | Strong fit for projects needing strategy development and cognitive task alignment |
| Merrill (Pebble in the Pond) | Task-centered and based on first principles of instruction; highly learner-focused | Conceptual; less operational detail; needs integration with other models | Best for real-world, authentic tasks requiring skill demonstration and integration |
| IPISD | Highly detailed; designed for performance-based training; clear control mechanisms | Rigid, complex, military-specific origin limits flexibility | Best for military training or large-scale projects requiring strict control and documentation |
| Gentry | Balanced development and support components; great for large-scale systems | May seem mechanistic; relies on detailed implementation tools | Useful in higher education or managing broad instructional systems |
| Agile Model | Empirical, iterative, and user-participatory; fast prototyping | Lacks detailed guidance on design stages; assumes team collaboration | Effective in fast-paced, collaborative environments like software-based or corporate eLearning |
| Seels and Glasgow | Integrates project management and diffusion; iterative feedback loop | Requires strong project management knowledge | Well-suited for long-term projects requiring scalability and stakeholder buy-in |
| Morrison, Ross, Kalman, Kemp | Non-linear and flexible; comprehensive with support resource planning | Can overwhelm novice designers due to its breadth | Best for instructional designers managing both design and delivery in dynamic educational settings |
| ASSURE | Emphasizes media use and learner characteristics | Less guidance for complex course development | Best for lesson-level planning and media-rich environments |
| PIE Model | Focuses on practical instructional delivery with technology integration | Limited detail in design and development phases | Ideal for classroom settings using tech and structured media planning |
| Dorsey, Goodrum, Schwen | Strong user involvement; rapid cycles improve usability | Conceptual and lacks detailed implementation process | Great for rapid prototyping in digital product or course development |
| Diamond | Curriculum-level planning with institutional alignment | Focused on higher ed; less detail for day-to-day teaching | Best for program-wide instructional design in universities or colleges |
| PDPIE Framework | Clear instructional design flow tailored to online course development | Emphasizes ID role only; limited collaboration guidance | Best for instructional designers building asynchronous or blended online courses |
| RID (Rapid Instructional Design) | Accelerated, learner-centered; emphasizes reflection, practice, and job application | Lacks depth in analysis, development, and evaluation; not standalone | Ideal as a plug-in to a broader ISD process for fast-paced training needing strong engagement and practical focus |
| HPT (Human Performance Technology) | Grounded in systems thinking; focuses on actual vs. desired performance; flexible interventions | Not exclusively instructional; may be complex to implement across organizations | Best for solving performance issues with a mix of instructional and non-instructional interventions |
Table 2: Comparison of ID models
Assessment functions as an indispensable mechanism for diagnosing readiness, guiding instruction, certifying achievement, and encouraging learner autonomy. Each type responds to a specific instructional need; some prioritize pre-instruction insights, while others focus on feedback loops, summative judgments, or student self-monitoring. The design and application of each depend on context, available resources, and instructional goals. Though the tools and techniques may differ, the intent remains consistent: to support purposeful, responsive learning.
The table below presents a comparative view of common assessment types.
COMPARISON OF ASSESSMENT TYPES
| Type of Assessment | Key Features | Purpose | Strengths | Weaknesses | When to Apply | Examples |
|---|---|---|---|---|---|---|
| Diagnostic Assessment | Pre-assessment to gauge prior knowledge, skills, misconceptions | Identify strengths and weaknesses before instruction | Helps tailor instruction; reveals misconceptions | Not used for grading; time-consuming | Beginning of unit or course | Pre-tests, self-assessments, interviews |
| Formative Assessment | Ongoing feedback during learning; not graded | Support learning and guide instructional adjustments | Encourages active learning; low-stakes feedback | Difficult to track across time; may lack rigor | Throughout instruction; during tasks | Journals, quizzes, peer feedback |
| Summative Assessment | Culminating assessment; evaluates what students learned | Measure achievement at the end of instruction | Clear accountability; easy to grade and communicate | Stresses memorization; limited feedback | End of a unit or semester | Final exams, projects, performance tasks |
| Authentic Assessment | Real-world tasks; evaluated with rubrics; active learning | Demonstrate application of skills in real-world scenarios | Engages students; aligns with real-world tasks | Hard to standardize; grading may be subjective | Project-based or task-oriented instruction | Case studies, design prototypes, simulations |
| Assessment of Learning | Used to rank and grade; test-based; typically summative | Certify achievement and communicate progress | Widely recognized; structured data collection | Often emphasizes competition; neglects improvement | Final assessments and report cards | Standardized tests, final grades, transcripts |
| Assessment for Learning | Teacher modifies teaching based on feedback; focus on progress | Guide learning decisions and next steps | Personalized feedback; improves engagement | Demands teacher adaptability; complex to implement | During instructional units to guide teaching | Checklists, feedback forms, annotated drafts |
| Assessment as Learning | Student-driven; self-assessment; reflection and self-monitoring | Foster metacognition and self-regulated learning | Empowers student ownership of learning | Requires maturity; not all students self-regulate | During learning to build self-assessment skills | Reflection logs, learning goals, peer evaluations |
| Digital Assessment | Technology-enabled; multiple attempts; real-time feedback | Enhance learning, access, and efficiency using tech | Immediate feedback; flexibility; customizable pathways | Tech issues, equity gaps, limited tactile/human connection | Online or blended courses; scalable settings | Online quizzes, LMS rubrics, video responses |
| Renewable Assessment | Creates lasting value; used beyond classroom context | Encourage critical thinking and relevance beyond school | Authentic, valuable artifacts; boosts motivation | Time-intensive; demands strong design and mentorship | Capstone projects, service learning, or internships | Client presentations, public wikis, outreach materials |
Table 3: Comparison of assessment types
REFLECTION
Instructional design models rely on learning and instructional theories. These theories explain why each design decision exists: they guide content sequencing and assessment choices. Without them, design decisions lose coherence.
When I taught in a private school, norm-referenced grading controlled academic recognition. Teachers checked alignment with the Most Essential Learning Competencies, yet ranking still dominated classroom culture: students waited for grades, compared scores, and competed for honors. Ties led to score reviews, and multiple students sometimes shared first honors. High scores carried weight, but those scores often failed to show real skill or readiness.
The K–12 system now uses criterion-referenced assessment. Recognition centers on mastery rather than competition: when learners meet the same standards, each receives “With Highest Honors.” This system supports fairness, yet doubts remain. PISA results place Filipino learners behind in reading, mathematics, and science, and many link these outcomes to standards-based assessment. That claim oversimplifies the issue.
During pre-service training, an instructor addressed a persistent problem: learners advance without strong foundations. Her message stayed clear. When a learner enters a class unprepared, responsibility transfers to the current teacher. Blame stalls progress; collective accountability sustains learning systems.
This principle guides my instructional design work today. I prioritize diagnostic strategies that surface gaps early, avoid ranking-focused routines that weaken learning value, and design systems that emphasize mastery. I rely on data with restraint and support interventions that restore learner confidence.
I use digital assessment as a support mechanism. Technology shortens feedback cycles, and digital rubrics, structured submissions, and comment tools improve clarity between teacher and learner. Limitations persist: academic dishonesty remains common, connectivity remains unstable, and hands-on performance remains difficult to evaluate online. These conditions influence design decisions, especially in resource-limited settings.
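As a small illustration of how a digital rubric shortens the feedback cycle, the sketch below scores a submission against weighted criteria and returns comments immediately. The criteria, weights, and the 0.7 comment threshold are hypothetical examples, not drawn from any particular LMS or from my own courses.

```python
# A minimal digital rubric (hypothetical): each criterion carries a weight
# and a prewritten comment attached whenever the score falls short.
RUBRIC = {
    "accuracy":     {"weight": 0.5, "comment": "Review the worked examples before resubmitting."},
    "completeness": {"weight": 0.3, "comment": "Several items were left blank."},
    "clarity":      {"weight": 0.2, "comment": "Label your solution steps."},
}

def grade(scores):
    """Combine per-criterion scores (0.0-1.0) into a weighted total and
    collect a comment for every criterion scored below 0.7."""
    total = sum(RUBRIC[c]["weight"] * s for c, s in scores.items())
    feedback = [RUBRIC[c]["comment"] for c, s in scores.items() if s < 0.7]
    return round(total, 2), feedback

total, comments = grade({"accuracy": 0.9, "completeness": 0.6, "clarity": 0.8})
print(total, comments)  # 0.79 ['Several items were left blank.']
```

Even a helper this small returns feedback the moment a learner submits, which is the feedback-cycle advantage digital assessment offers; the limitations noted above (dishonesty, connectivity, hands-on tasks) sit outside what such a script can address.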
I initially examined blockchain as a transparency tool for digital assessment. Its secure architecture promised traceable submissions and fair grading, but field conditions challenged that direction: shared devices, unreliable internet, and uneven digital skills reduce feasibility across many Philippine schools.
That realization redirected my work toward instructional design. My proposal, *Integrating Blockchain as a Learning Tool for Simplifying Decentralized Technologies in Philippine K–12 Classrooms*, treats blockchain as learning content. The study examines how junior high learners engage with core blockchain concepts through clear instruction. It aligns with the national digital literacy framework. It accounts for classroom and institutional constraints.
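To make those core concepts concrete at a classroom level, a minimal sketch like the one below shows what learners would examine: blocks, hashing, chaining, and tamper detection. This is a hypothetical teaching example, not material from the proposal itself; the sample block data is invented.

```python
import hashlib
import json
import time

def hash_block(block):
    """Return the SHA-256 hash of a block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    """Create a block that records data and points to the previous block's hash."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Build a three-block chain: each block stores the hash of the one before it.
genesis = make_block("genesis", "0" * 64)
second = make_block("Maria submitted Quiz 1", hash_block(genesis))   # invented sample data
third = make_block("Jose submitted Quiz 1", hash_block(second))
chain = [genesis, second, third]

def is_valid(chain):
    """A chain is valid when every block's prev_hash matches the
    recomputed hash of its predecessor."""
    return all(chain[i]["prev_hash"] == hash_block(chain[i - 1])
               for i in range(1, len(chain)))

print(is_valid(chain))         # True
chain[1]["data"] = "tampered"  # Editing any earlier block...
print(is_valid(chain))         # ...breaks every later link: False
```

Running the script shows the chain validating before the edit and failing after it, which is the tamper-evidence property the lessons would build on.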
This direction reflects how I design instruction now. I prioritize clarity and access when introducing emerging technologies. Instructional models guide planning. Reigeluth’s Elaboration Theory supports content sequencing. Keller’s ARCS Model supports motivation. Assessment aligns with these frameworks. Criterion-referenced and digital tools hold value only when they measure learning without technical barriers. I design learning environments where instruction and assessment remain clear, fair, and aligned.
References
21st Century Education Systems, Inc. (n.d.). The ASSURE model. https://21stcenturyeducationinc.weebly.com/the-assure-model.html
Branch, R. M., & Dousay, T. A. (2015). Survey of instructional design models (5th ed.). Association for Educational Communications and Technology.
Center for Educational Technology & Instruction. (2015). Instructional design principles lecture (SAM) [Video]. YouTube. https://www.youtube.com/watch?v=1h2iREzXAxU
Chappell, M. (2018, September 26). Instructional design using the ADDIE model. eLearning Industry. https://elearningindustry.com/addie-model-instructional-design-using
Clark, D. R. (2014). Rapid instructional design (RID). http://www.nwlink.com/~donclark/hrd/learning/id/RID.html
D’Angelo, T., Bunch, J. C., & Thoron, A. (2018). Instructional design using the Dick and Carey systems approach. https://edis.ifas.ufl.edu/publication/WC294
Dessinger, J. C., Moseley, J. L., & Van Tiem, D. M. (2012). Performance improvement/HPT model: Guiding the process. Performance Improvement, 51(3). https://deepblue.lib.umich.edu/bitstream/handle/2027.42/90576/20251_ftp.pdf
Earl, L. (2003). Assessment as learning: Using classroom assessment to maximise student learning. Corwin Press. https://web.uvic.ca/~thopper/iweb09/GillPaul/Site/Assessment_files/Assessment.pdf
Graham, C. R., Slaugh, B., McMurry, A. I., Sorensen, S. D., & Ventura, B. (2024). Assessment plan (Chap. 4). In Blended teaching in higher education: A guide for instructors and course designers. EdTech Books. https://edtechbooks.org/he_blended/chapter_4_assessment_plan
Gregory, R. (2016). The MRK model of instructional design [Video]. YouTube. https://www.youtube.com/watch?v=_LwUQWmWlF0&ab_channel=RhondaGregory
Masters, G. N. (2014). Assessment: Getting to the essence. Centre for Strategic Education. https://research.acer.edu.au/ar_misc/18
McGriff, S. J. (2000). Instructional system design (ISD): Using the ADDIE model. https://www.lib.purdue.edu/sites/default/files/directory/butler38/ADDIE.pdf
Merrill, M. D. (2002a). A pebble-in-the-pond model for instructional design. Performance Improvement, 41(7), 39–44. http://www.clarktraining.com/content/articles/PebbleInThePond.pdf
Merrill, M. D. (2002b). First principles of instruction. Educational Technology Research and Development, 50(3), 43–59. https://ocw.metu.edu.tr/pluginfile.php/9336/mod_resource/content/1/firstprinciplesbymerrill.pdf
Merrill, M. D. (2008, August 12). Merrill on instructional design [Video]. YouTube. https://www.youtube.com/watch?v=i_TKaO2-jXA
Northern Illinois University Center for Innovative Teaching and Learning. (2012). Formative and summative assessment. In Instructional guide for university faculty and teaching assistants. https://www.niu.edu/citl/resources/guides/instructional-guide
Pappas, C. (2014, December 6). Instructional design models and theories: Elaboration theory. https://elearningindustry.com/instructional-design-models-elaboration-theory
Sharif, A. (2015). PDPIE framework: Online course development quality cycle. In Expanding learning scenarios: Opening out the educational landscape, Proceedings of the European Distance and E-Learning Network Annual Conference 2015. https://www.researchgate.net/publication/276290503_PDPIE_Framework_Online_Course_Development_Quality_Cycle
Subhash, P. D., & Ram, S. (n.d.). Types of assessment and evaluation. https://egyankosh.ac.in/bitstream/123456789/80503/1/Unit-13.pdf
Treser, M. (2015, August–September). Getting to know ADDIE. eLearning Industry.
University of Connecticut. (n.d.). Alternative authentic assessment methods. https://cetl.uconn.edu/resources/teaching-and-learning-assessment/teaching-and-learning-assessment-overview/assessment-design/alternative-authentic-assessment-methods
van Merriënboer, J. J. G. (n.d.). Van Merriënboer's 4C/ID model. https://drive.google.com/file/d/1PzcSSeOAZnYRjeBvWNp8lnwtUlsoh2R9/view
Watson, E. (n.d.). Defining assessment. https://www.ualberta.ca/centre-for-teaching-and-learning/media-library/resources/assessment/defining-assessment.pdf
Wiggins, G., & McTighe, J. (1998). What is backward design? (Chap. 1). In G. Wiggins & J. McTighe, Understanding by design. Association for Supervision and Curriculum Development. https://educationaltechnology.net/wp-content/uploads/2016/01/backward-design.pdf
Yates, N. (2017, May 27). Kemp’s model of instructional design [Video Playlist]. YouTube. https://www.youtube.com/playlist?list=PL-_oM9qDZ39FcQ1ylVkgIviauYjNHHsss

