New Internal Assessment - a Comparison
The Internal Assessment is changing to an Exploration:
first teaching September 2019
This page, and the linked pages, aim to help Studies teachers understand the similarities and differences between the current Studies 'Projects' and the current SL/HL exploration criteria (to be updated once the 'new exploration criteria' are released).
Both the Applications and interpretation and the Analysis and approaches explorations will be marked using the same criteria at SL and HL.
The only difference between the SL and HL criteria is in criterion E: Use of mathematics. Towards the end of this page, we have carefully marked five current Studies 'Projects' using the existing SL/HL exploration criteria (these are not 'official' IB marks).
Marking existing Projects using the Exploration criteria
Comparing the New with the Old . . .
Here are the marks awarded when the same internal assessments were marked on this site using the current Studies 'Project' criteria (again, not 'official' IB marks):
'Studies' Projects criteria marks
| Project | Mark |
| --- | --- |
| Human Development - Investigating the Human Development Index | 19/20 |
| Gun Crime - Gun crime, ownership and the Brandt Line | 19/20 |
| GDP and Fertility - Comparing African and European countries | 15/20 |
| Aircraft - Investigating an aircraft's useful load | 14/20 |
| Popular Lyrics - Which words are used the most in songs? | 10/20 |
Comparison Table: Studies Project Criteria and 'New' Exploration
When the SL/HL internal assessment changed from the portfolio Type I (pure mathematics investigation) and Type II (modelling) format to the current 'Exploration' model, examiners were advised, when marking, to put the old criteria out of their minds and focus on the new ones (without reference to the old). (The 'modelling' Type II tasks remain useful resources in preparation for the HL paper 3.) I found this very difficult to do to begin with, but it was actually good advice. However, when a change first comes in, it is helpful, as a first "entry point", to try and see what comparisons, if any, there might be between the Studies Project criteria and the 'new' exploration criteria. Below is a table that aims to make some of the points of comparison clear and concise.
Can Studies Statistical Projects make good 'Explorations'?
The short answer is: "Yes, they can". The April 2017 Curriculum Review Report states (p.6-7): "A trial was carried out in August 2016 to test the assessment criteria to ensure fairness and comparability between the types of explorations that students are likely to produce. It also indicated that students who produce a Mathematical studies SL type “project” would not be penalized using the proposed criteria."
The new criteria for the Applications and interpretation subject have now been released (as of Feb 2019). It certainly seems true, using the existing SL/HL exploration criteria, that Mathematical Studies projects will continue to be entirely acceptable and capable of achieving high marks. However, the criteria for the current Studies Project and the Exploration differ significantly. This suggests that whilst Studies 'style' statistical projects will make good explorations, the way in which teachers prepare their students, and the content, will probably need to adapt to better meet the new criteria and syllabus.
Below, we offer some 'Top Tips' on modifications that should help students writing statistical explorations to better align their work with the aims and expectations of the 'Exploration' criteria. The IB's TSM materials, published since Feb 2019, will be the first point of reference for gaining a better understanding of these issues.
Writing a Studies style project with the Exploration criteria, and new SL syllabus, in mind . . .
A - Presentation
Social science, Geography and similar questions are often very complex: numerous contributing factors influence key issues such as income, education level, pollution level etc. The misuse of statistics in this area can be very damaging, e.g. countries aligning their economic policy with unemployment figures when unemployment is not measured the same way from one country to the next, the minimum wage varies or doesn't exist, social security systems diverge, the cost of living is not the same, tax avoidance rates are higher or lower, and so on. To ensure concision and coherence, it is important that the aim is tightly focused.
B - Mathematical communication
Consideration of the appropriate degree of accuracy, including use of the “approximately equal to” notation (≈), is taken into account when awarding marks for this criterion (as mentioned in the TSM material, but not directly in the criteria or notes). Examples of common errors are:
(i) predictions given to a greater degree of accuracy than the data/measurements provided;
(ii) no use of the ‘approximately equal to’ notation for results/predictions obtained using a ‘model’, e.g. a function that models (and therefore ‘estimates’) a set of data, the shape of a building etc.;
(iii) poorly labelled graphs (their axes in particular) and diagrams: these are mentioned under exploration criterion B across a number of SL subject reports, whereas they are not cited in the Project subject reports (the definition of variables, the correct use of notation and the accurate and precise use of terminology are cited in both Project criterion G and Exploration criterion B).
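As a minimal sketch of points (i) and (ii), using invented measurements and a hypothetical linear model (not taken from any of the projects above), a prediction can be reported to the accuracy of the data, with the '≈' sign, rather than to full calculator precision:

```python
# Invented example: the data are measured to 1 decimal place, so a
# model's prediction should not be quoted to greater accuracy.

def predict_weight(height_m):
    # hypothetical linear model w = 81h - 74.7, assumed fitted elsewhere
    return 81 * height_m - 74.7

raw = predict_weight(1.72)   # full calculator precision, about 64.62
reported = round(raw, 1)     # match the 1 d.p. accuracy of the data
print(f"predicted weight \u2248 {reported} kg")  # a model only estimates, hence the ≈ sign
```

The point is not the particular model, but the habit: quote the rounded value with '≈' and state the degree of accuracy used.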
There is no “Information/Measurement” criterion for the exploration. In the current Studies criteria there are three marks available for data collection and measurement. In the exploration criteria, the quality of the data used/collected is considered under criterion C, personal engagement, and perhaps, to some extent, under criterion D, but will likely make a difference of only one or two marks at most.
D - Reflection
It is good practice to exercise caution when drawing inferences and conclusions from statistics: correlation does not imply causation, and statistics provide differing degrees of evidence for different positions but can never 'prove' (mathematically) a social science theory. Although some of these points are not explicit in the attainment descriptors, the habit of exercising caution helps to foster greater reflection, and greater focus (criterion A), i.e. what questions can and can't be addressed realistically given the page (and time) limitations of the 'exploration'.
Neither ‘critical’ reflection nor ‘substantial evidence’ really forms part of the Project's criterion D descriptors. It is therefore to be expected that current Projects, written with the Project criteria in mind, will not reach achievement level 3 using the exploration criteria (but will be able to once this is included in students' preparation for the internal assessment).
E – Use of mathematics
The syllabus has changed. This seems an obvious point to make, but it is particularly significant for items that are in the existing Studies syllabus but where a greater degree of understanding or further application is now expected, e.g. the conditions under which the use of Pearson's correlation coefficient is justified: consideration of outliers is explicitly 'not required' in the Studies syllabus, but is included in the current SL and HL.
In a statistical/modelling exploration, if the data, when plotted on a scatter graph, look non-linear, it is expected that the student would use neither ‘r’ nor linear regression. They would instead be expected to find an appropriate non-linear function from the syllabus (or beyond/commensurate with it) to model the data. This is a significant departure from the Project, where there was no expectation on students to find an appropriate non-linear model using their knowledge of functions.
If the student neither considers that ‘r’ is inappropriate, nor attempts to model the data using a non-linear function, then "limited" knowledge and understanding of the modelling process, rather than "some" or “good”, is likely the better fit descriptor.
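As an illustrative sketch (with invented data, and a hand-rolled Pearson's r using the standard formula), a high value of 'r' can arise even when the relationship is clearly exponential; linearising with ln(y) shows that the non-linear model is the appropriate one:

```python
import math

# Hypothetical data lying exactly on y = 2e^(0.5x): clearly non-linear
xs = list(range(1, 9))
ys = [2 * math.exp(0.5 * x) for x in xs]

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

r = pearson_r(xs, ys)
print(f"r \u2248 {r:.3f}")  # strong positive r, despite the data lying on a curve

# Linearising: ln(y) = ln(2) + 0.5x is exactly linear, so r is 1
r_log = pearson_r(xs, [math.log(y) for y in ys])
print(f"r (x vs ln y) \u2248 {r_log:.3f}")
```

A student who quoted the first 'r' and fitted a straight line would be missing exactly the understanding criterion E rewards; recognising the exponential shape (here via the log transformation) is what shows command of the modelling process.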
Given the ‘applications’ and ‘interpretation’ focus of the new subject, with its emphasis on the role of technology in solving mathematical and scientific problems, it is unlikely that students will be required to calculate values of ‘r’ manually in their internal assessment. The emphasis on understanding in criterion E is more likely to be satisfied by students showing they understand the limitations of Pearson's correlation coefficient: in particular, the effect of outliers on the validity of ‘r’, the assumption of linearity, and the circumstances under which it is, and is not, appropriate to use ‘r’.
The careful interpretation of the value of ‘r’ can further help to evaluate the student’s level of understanding.
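A small sketch of the outlier effect mentioned above, with invented numbers: a single extreme point can turn a near-zero correlation into an apparently strong one, which is why the exploration criteria expect students to consider outliers before quoting ‘r’.

```python
import math

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Five hypothetical points with essentially no linear relationship
xs = [1, 2, 3, 4, 5]
ys = [3.0, 2.9, 3.2, 2.8, 3.1]
r_clean = pearson_r(xs, ys)                    # weak: r is about 0.1

# A single outlier inflates r dramatically
r_outlier = pearson_r(xs + [10], ys + [10])
print(f"without outlier: r \u2248 {r_clean:.2f}")
print(f"with outlier:    r \u2248 {r_outlier:.2f}")
```

Commenting on such a point, and on whether it is a genuine observation or an error, is precisely the kind of careful interpretation that criteria D and E reward.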
Under the current Studies Projects, unlike for the current SL/HL explorations, there was no expectation that students develop alternative, non-linear models analytically using their knowledge of functions. This is part of the reason why calculating ‘r’ for non-linear data was treated with greater leniency, provided the scatter plot of the data didn't show a very clear, unequivocal non-linear relationship. Given the “modelling” emphasis of the new subject, students are more likely to be expected to work out more appropriate, non-linear models using technology. Evaluation of a sub-optimal model is considered under criterion D: reflection; choosing an entirely inappropriate model to begin with suggests a lack of understanding of the different functions, and/or of the modelling process itself.
The differences mentioned in the treatment of linear and non-linear data help explain the more significant changes in the marks of some ‘Projects’ when marked using the new ‘exploration’ criteria, e.g. HDI and Gun Crime.
The IB Teacher Support Materials (TSM) advise that: “It is important that relevant background information . . . is included with the sample”. Below are some key points to be covered in this information:
Which syllabus items/units were covered in class prior to the students writing their explorations. If a student has had to teach themselves some of the mathematics used in the exploration, and has gained at least a ‘limited’ understanding of it, this will be taken into account under criterion C, personal engagement.
What access to technology and mathematics/science software do the students have? Which software has been used in class with students? Again, if a student has used a piece of software effectively that they had to go away and learn how to use by themselves, this will be considered under criterion C. It will also help the moderator take into consideration the possible slight imprecision of some graphs that have been drawn by hand, rather than using graphing software, if the school/country is in a situation where access to such software was difficult or limited.
Other comments can be included if the teacher thinks they will be helpful for the moderator. However, marks can only be awarded based on evidence within the exploration itself that corresponds to a given attainment level descriptor. As stated in the guide's internal assessment section, criterion C: “There must be evidence of personal engagement demonstrated in the student's work. It is not sufficient that a teacher comments that a student was highly engaged . . . criterion C assesses the extent to which the student engages with the topic by exploring the mathematics and making it their own. It is not a measure of effort”.