
doi: 10.5772/13320
Higher Education aims to develop complex theoretical, abstract and analytical reasoning capabilities in its graduates. This objective can be accomplished by addressing four major steps: theoretical foundation, practice, communication and assessment (Petry, 2002). Theoretical background and practical exercises comprise the basic knowledge-building process at the initial stages. Assessment guides the student through higher-complexity studies, permitting the student to identify the weak points in the knowledge building where further theory study and/or practice is required.

Theoretical foundation and problem-solving practice are well-known aspects of Higher Education. High-quality materials in printed and electronic format, sometimes including multimedia (audio, video or computer graphics) and sometimes delivered through the Internet, are readily available today. Teaching aids such as computer-fed overhead projectors or electronic blackboards in the classroom are commonplace and facilitate the knowledge-building process. Moreover, computers in the classroom are a powerful tool linking theory and problem-solving practice in engineering studies (Beyerlein et al., 1993).

On the other hand, the assessment process has evolved slowly over the last decades. Pen-and-paper examination techniques have been translated to the computer-enabled classroom as software applications that present an exam on the screen and record the student's answers. This process can be seen as an external assessment whose aim is to evaluate the skills of the student in order to give a pass/fail grade on a given subject. The external evaluation can be useful for the student to gauge his or her skill level, but it usually falls short when the student wants to know "what's wrong", i.e. to know not only which questions were missed but also in which knowledge areas the student is having difficulties. Identifying weak areas cannot be done from a single question or set of questions; it requires the assessment process, an examination or similar, to be considered as a whole.

Student attention time, that is, the time spent thinking on a specific question relative to the other questions (since some students think faster than others), clearly indicates the areas where difficulties hide, for example. The pattern followed when answering the exam questions is another useful indicator: the student will try to answer first the questions he or she feels more comfortable with. Hesitation over an answer (changing the answer several times) is another useful parameter. All these parameters, and many others, can be compiled and processed with the powerful analytic capabilities of modern data mining techniques. Collaborative assessment appears as the natural evolution from individual learning to collaborative learning, which has been proposed as a suitable technique to speed up the development of analytical reasoning, as different approaches are continuously suggested.
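As a minimal sketch of how such behavioural parameters could be extracted before any data mining step, consider a hypothetical event log produced by the exam software, where each entry records a timestamp (seconds since the exam started), the question being answered, and the answer given. The log format, function name and feature names below are illustrative assumptions, not part of any particular assessment system:

```python
from collections import defaultdict

def extract_features(events):
    """Derive per-question behavioural features from a timed answer log.

    `events` is a chronological list of (timestamp_s, question_id, answer)
    tuples, a hypothetical format for an exam application's log.
    Returns a dict mapping question_id to:
      - "time_spent": seconds attributed to the question,
      - "changes": how many times the answer was revised (hesitation),
      - "first_visit_rank": position of the question in the answering order.
    """
    features = defaultdict(
        lambda: {"time_spent": 0.0, "changes": 0, "first_visit_rank": None}
    )
    visit_order = []          # questions in the order they were first answered
    last_answer = {}          # most recent answer per question
    prev_t = 0.0              # timestamps are relative to exam start

    for t, q, ans in events:
        if q not in visit_order:
            visit_order.append(q)
        # Attribute the time elapsed since the previous event to the
        # question being answered now.
        features[q]["time_spent"] += t - prev_t
        # An answer that differs from the previous one counts as a change.
        if q in last_answer and ans != last_answer[q]:
            features[q]["changes"] += 1
        last_answer[q] = ans
        prev_t = t

    for rank, q in enumerate(visit_order, start=1):
        features[q]["first_visit_rank"] = rank
    return dict(features)
```

Feature vectors of this kind, one per question and per student, could then be fed to standard clustering or classification techniques to group questions, and hence knowledge areas, by the difficulty they reveal.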
