Design institutes in India often face a question: is there a framework for measuring best practices in design education and the quality of student outcomes? The answer, so far, has been 'No'. Several related questions follow: How do you measure design qualities and learning outcomes? What are the best practices in design education? What is the quality of design education? What is the standard of design colleges in India?
To answer these questions, we have recently launched the Design Institute Evaluation Framework (DIEF). Drona Seeker is pleased to host the DIEF to help design institutes maintain a minimum standard of design education in India. The DIEF is also established with the goal of nation building, supporting and promoting Make-in-India, Start-up India, and the New Education Policy of India. Under this framework, a design institute is evaluated on the following measures –
| Obtained Score | Grade | Impression |
|---|---|---|
| 91-100% | D+++ | Excellent Design School |
| 81-90% | D++ | Very Good Design School |
| 71-80% | D+ | Good Design School |
| 61-70% | D | Average Design School (many interventions required) |
| <60% | D- | Poor Design School |
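As a quick illustration, here is a minimal sketch of mapping an obtained score to its DIEF grade band; the function name and the handling of the 0-100 scale are assumptions, and only the bands come from the table above.

```python
def dief_grade(score_percent: float) -> str:
    """Map an obtained DIEF score (0-100%) to its grade band."""
    if score_percent >= 91:
        return "D+++"   # Excellent Design School
    if score_percent >= 81:
        return "D++"    # Very Good Design School
    if score_percent >= 71:
        return "D+"     # Good Design School
    if score_percent >= 61:
        return "D"      # Average Design School (many interventions required)
    return "D-"         # Poor Design School

print(dief_grade(84.5))  # -> D++
```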
The OBLP refers to the measurement of the attainment of design learning against learning objectives, as a result of best practices in design mentoring. To measure the OBLP, the programme outcomes of a design programme must be mapped to the course outcomes of every course under that programme. Faculty or programme designers should then prepare course contents and map assignments or theoretical test questions to those outcomes. Next, a benchmark should be set for the outcome of each course in terms of the skill/knowledge students should gain, e.g. 50% of students in a batch should score more than 50%, assessed against a fixed set of suitable rubrics. After the actual assessment of a course, the faculty should calculate the attainment and check the deviation from the preset benchmark on the basis of the marks obtained by the students taking that course. The faculty or programme designer should also document the mentoring pedagogy (e.g. gamified learning, experiential learning) applied during the course. The OBLP should be calculated for all courses under a programme, and the cumulative reflections should then be monitored by an expert team of design educators. The teacher-student ratio should not exceed 1:15; otherwise, the institute scores lower in this section.
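A minimal sketch of the attainment check described above, assuming the benchmark is expressed as "at least X% of the batch scores above Y%"; the data structure and the default threshold values are illustrative, not prescribed by the DIEF.

```python
def course_attainment(marks_percent: list[float],
                      pass_threshold: float = 50.0,
                      benchmark_fraction: float = 0.50) -> dict:
    """Compare actual course attainment with the preset benchmark.

    marks_percent      : each student's score in the course, in percent
    pass_threshold     : the per-student score benchmark (e.g. 50%)
    benchmark_fraction : fraction of the batch expected to cross it (e.g. 50%)
    """
    attained_fraction = sum(m >= pass_threshold for m in marks_percent) / len(marks_percent)
    return {
        "attained_fraction": attained_fraction,
        "benchmark_fraction": benchmark_fraction,
        "deviation": attained_fraction - benchmark_fraction,
        "benchmark_met": attained_fraction >= benchmark_fraction,
    }

# Example: 7 of 10 students scored 50% or more, so the 50% benchmark is met
print(course_attainment([72, 55, 48, 61, 90, 33, 58, 67, 45, 51]))
```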
# Quantitative (40%) and Qualitative Measures (60%)
The IDE is based on the kind of programme offered by a design school. For instance, a product design undergraduate (UG) programme requires mechanical workshop facilities, a digital drawing studio, a CAD modelling studio, a clay modelling studio, a painting studio, etc., whereas a user experience design undergraduate programme requires a computer lab with digital prototyping and visual design software, an immersive experience (AR/VR) studio, a tangible interface design lab, a human factors and usability testing lab, etc. Safety guidelines and software/tool user guidelines should be displayed in all labs, studios, and workshops. All facilities should be developed in proportion to the student strength. The computer-student ratio should not be greater than 1:10. All classrooms should be ICT-enabled. The design school's physical infrastructure should be designed so that it is perceived as a design habitat.
# Quantitative (20%) and Qualitative Measures (80%)
Research activities of a design school should be presented for the last 5 years. The DROs are listed below –
# Quantitative (80%) and Qualitative Measures (20%)
The DCSC practices depend on the following parameters –
# Quantitative (80%) and Qualitative Measures (20%)
The PR score depends on –
# Quantitative (100%) and Qualitative Measures (0%)
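Each of the measures above blends a quantitative and a qualitative component in the proportions given in its heading. Below is a minimal sketch of how such a blend might be computed, assuming both components are scored on a 0-100 scale; the function, the dictionary, and the example values are illustrative and not prescribed by the DIEF.

```python
# Quantitative/qualitative weights per measure, as given in the headings above
MEASURE_WEIGHTS = {
    "OBLP": (0.40, 0.60),
    "IDE":  (0.20, 0.80),
    "DRO":  (0.80, 0.20),
    "DCSC": (0.80, 0.20),
    "PR":   (1.00, 0.00),
}

def measure_score(measure: str, quantitative: float, qualitative: float) -> float:
    """Blend the quantitative and qualitative scores (0-100) for one measure."""
    w_quant, w_qual = MEASURE_WEIGHTS[measure]
    return w_quant * quantitative + w_qual * qualitative

print(measure_score("IDE", 70, 85))  # 0.2*70 + 0.8*85 = 82.0
```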
Every institute should conduct this assessment programme and validate all assessment results by inviting external experts. An institute may also form its own team, self-assess the design school, and then validate the assessment with external design experts. The internal or external assessment team may be constituted as follows –
***Experts should cross-check the evidence for all data produced.
At least 3 experts from among these should assign scores. The final assessment score is calculated on the basis of a 60% weight on external experts' scores and a 30% weight on internal experts' scores.
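A minimal sketch of this weighted combination, using the 60% external and 30% internal weights quoted above; the averaging of individual expert scores, the 0-100 score scale, and the function name are assumptions.

```python
def final_assessment_score(external_scores: list[float],
                           internal_scores: list[float]) -> float:
    """Combine expert scores using the weights quoted in the framework:
    60% from external experts and 30% from internal experts.
    Each expert's score is assumed to be on a 0-100 scale and averaged first.
    """
    if len(external_scores) + len(internal_scores) < 3:
        raise ValueError("At least 3 expert scores are required")
    external_avg = sum(external_scores) / len(external_scores)
    internal_avg = sum(internal_scores) / len(internal_scores)
    return 0.60 * external_avg + 0.30 * internal_avg

# Example: two external experts and one internal expert
print(final_assessment_score([82, 78], [88]))  # 0.6*80 + 0.3*88 = 74.4
```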
Every year, Drona Seeker arranges the DIEF compliance check and sends a team of experts to visit the Design Institutes/Schools participating in DIEF.
The participating institute only needs to arrange travel, food, and lodging for the team of external experts (provided and suggested by DRCI) for the purpose of the compliance audit.
We are thankful to Dr. Anirban Chowdhury (Ph.D. in Design from IIT-Guwahati) for voluntarily creating the draft of the Design Institute Evaluation Framework (DIEF).
Please write to Hony. President of Design Research Council of India (DRCI) at hony.president@drci.in
Copyright © 2024 Design Research Council of India (DRCI Research Community LLP) - All Rights Reserved.