Student Interview Report

VICTORIOUS PROJECT:
STUDENT INTERVIEWS & PILOT TWO
Recording, analysing and classifying
students’ levels of satisfaction
vis-à-vis virtual mobility
Index
Introduction
Part I: Student Interviews: their nature and function in the Victorious Project
Part II: Culture shock: moving from high to low or vice versa in digital services
Conclusions
1. Introduction
What experiences of virtual mobility do European students have and how do they view these
experiences? Are they positive or negative? Or are they mixed and subject to variation over a period
of time? How do Universities take into account and plan for the transitions from one service culture
to another? How can transitions, associated culture shocks and attempts by Universities to assist
students during such transitions be accurately measured?
This study attempts to reconstruct students’ pathways through virtual mobility experiences
and degrees of satisfaction in a visual form that allows comparisons to be made and conclusions to
be drawn. It is concerned with creating analytical tools and carrying out explorations rather than with
providing definitive results. Even so, the study contributes to the final checklists (see Final Report)
by suggesting critical considerations that are ultimately linked to the promotion and development of
the personality and awareness of individual students experiencing virtual mobility in the context of
real mobility. Nevertheless, the focus in this report is on the development of tools related to the basic
concerns of the Victorious Project’s study of the current implementation of virtual mobility: Quality,
Interoperability/Open Standards, and Digital Repositories and Resources. In this study, these concerns are
viewed from the standpoint of ease-of-access, friendliness, simplicity of use and general motivation and
encouragement that individual Universities provide to students through their service offering in the field
of virtual mobility. The report is divided into two halves. Part I is dedicated to student interviews,
Part II to Pilot 2 which is concerned with analysing and classifying the culture shock associated with
transitions from one University to another. Though the two parts can be read separately, Part I is
instrumental in making Part II possible, as a close reading makes clear.
In the early stages of the Victorious project, it became clear that the tools of data collection
and interpretation needed to be discussed and researched in order to underpin those aspects of the
project’s goals that are linked to attitudes, judgements and interpersonal relations. These are hard to
measure, in particular when using traditional tools such as questionnaires which tend, if anything, to
hide rather than reveal students’ real feelings. Two specific objectives needed to be explored:
1) The development of methods of data collection: specifically with the goal of encouraging
students to describe their encounters with service systems in the Universities of the country
they were visiting and to specify how the various aspects of the service delivery system
measured up to their expectations and previous experiences;
2) The development of a system of classification of experiences that defines and measures
the culture shocks that students described as occurring in their transitions from one
University to another.
These objectives were thus respectively concerned with developing research tools for data
gathering and their subsequent analysis and classification in systematic terms. Though we were
dealing with experimental data gathered from a limited number of Universities, the focus remained
on defining methods and a classificatory system applicable in a wider framework. Hence the
applicability of this research to the large-scale implementation of virtual mobility, ultimately the
major driving force in the rationale of the Victorious project.
With regard to these objectives, it may be noted that research projects typically lie between
two extremes forming a cline. At one end of the cline are research projects with a well-defined initial
framework in which data gathering and interpretation methods rely on “standard” methods, in
particular well-researched types of questionnaires and interviews. Such projects are, by and large,
typically surveys concerned with assessing the distribution of data within a pre-established framework
and adopt ready-made methods that remain fixed throughout the project. At the opposite end are
research projects which define and firm up on their framework of reference over a period of time on
the basis of initial working hypotheses. A major objective in such research is to work on a constantlyevolving model of investigation until a stage is reached where the model both elicits and interprets
data in convincing ways. Such projects may require greater lead-in times, as they are initially
explorative but firm up on their overall structure and findings as they proceed, thus providing greater
all-round credibility.
One of the Victorious project’s major goals – providing stakeholders with good-practice
checklists that inform their service culture – helps Universities to be more objective about the way
their services match up to expectations. The goal of providing checklists can only be achieved by
constant reflection on how revealing data can be obtained that provides an accurate snapshot of
virtual mobility, as it stands today in European Universities, and gives clues to probable and desirable
future scenarios.
Part I of the study presented below thus summarises the exploratory research within the
Victorious project using filmed interviews of students talking to interviewers. These interviews form a
corpus of data providing valuable insights into student experiences of virtual mobility within their
more general experience of real mobility and are available on the BSCW. These experiences provide
concrete evidence of both good and bad practices; in other words, the experiences recounted by the
students interviewed shed considerable light on critical points where mismatches arose between
Universities’ service delivery culture, its actual implementation and students’ actual needs, desires and
expectations. This part of the research attempts to define ways of recording specific experiences that
pinpoint specific aspects of Universities’ service culture and service delivery systems that have
apparently been overlooked, underestimated or deliberately ignored. Without an integrated and
experimental approach to data gathering and its interpretation many of the tell-tale details and
moments of truth would certainly have been missed.
Part II is designed to analyse and systematise these experiences in ways which, as
documented below, could inform and guide other parts of the project. Again on an experimental
basis it attempts to relate methodology to the very special requirements of virtual mobility.
In its brief summary, the report makes explicit reference to a total of 14 European
Universities in the following countries: Estonia, Finland, France, Germany, Holland, Italy, Spain and
the UK. None of the Universities is named but where appropriate indication is given as to the
country in which the University is located.
Part I: Student Interviews: Their nature and function in the Victorious Project
1. Introduction: research tools
Any approach to data collection which gets close to students’ real feelings about their experiences is
very much the name-of-the-game. We did not create a call centre which called up students in the
middle of the night to ask them about their virtual mobility experiences. Nor did we create a blog
designed to elicit the data we needed. But we did come very close to doing so. We phoned Erasmus
students and asked them if they would come and talk to us and whether they would be happy if we
filmed the discussion. In some cases we invited students to come to talk to us in a group and in
others as individuals. In other words, we varied the tactics. We used telephonists and interviewers
who were only one or two years older than the students. Our “staff” had recently undergone similar
real and virtual mobility experiences. We got them to talk to the students in a way which empathised
with them. We did not use old-fogey interviewers such as University professors since the students would
have refused to co-operate. Our front office was different. Students generally agreed with our
approach.
Contact between information seekers and respondents naturally varied from one university
to another for a variety of reasons. We deliberately varied the approach. One variant involved an
interviewer filling in a questionnaire either in a face-to-face interview or through a telephone
interview. Ultimately, our approach was designed to engage students in a structured interview that
allowed them to recount their own experiences from their own point of view. Our staff were trained to
use a written questionnaire but to ask the questions in an order that followed and stimulated the flow
of conversation. We discussed with our interviewers the need to elicit information in as indirect a way
as possible. For example, anecdotal evidence strongly suggests that the classic instrument of virtual
mobility, the webpage, is not in fact the prime focus for the current generation of mobile students
who seem to be privileging other technologies including, for example, the mobile phone. We chose
not to ask this question directly but to let it emerge naturally from the discussion.
2. Towards the video questionnaire: integrating questionnaires, interviews and video
recordings in the gathering and interpretation of data.
Even with this modification of the basic questionnaire there are many cases in which vital data is
missed because the researchers who drew up the questionnaire in the first place did so on the basis of
their previous experiences which however vast, cannot hope to capture the wealth of detail that
actually arises. Factors such as age differences between researchers and respondents add to this problem. We were working towards the
video questionnaire and our task would have been much easier if this had been in place from the
outset. The video questionnaire is not a new concept. It has been used, for example, in an
experimental way for diagnostic purposes in medical diagnoses, where visual elements − such as
diagrams, charts and occasionally video clips − are used to better contextualise a written question and
guide respondents’ answers. Figure 1 is a mock-up of a small part of such a video questionnaire that
could be used in projects.
Section 2: Laboratory facilities

[► player button] First click on the player button and listen to a student recalling her experiences.
Then answer the questions given below.

Question 1: Was your experience of services in the University where you were a guest broadly
similar to those described in the interview?
YES □   NO □

Question 2: Which of the following were better:
a) opening times □
b) quality of assistance □
c) bookings for equipment □

(Player symbols to be introduced.)
Figure 1: A mock-up of an online video questionnaire
A video clip integrated into an online questionnaire facilitates and encourages compilation of the
questionnaire in question. The video questionnaire improves on traditional questionnaires as a means
for data collection and interpretation. It prompts respondents to recollect and share their own
experiences. It can be used in a number of interactional contexts, including interactive websites
designed to gather data. By promoting a significant interpersonal element which offsets the rather
tedious nature of question-and-answer formats, it gives researchers a yardstick with which to describe
the differences and similarities in the shared experiences of students currently attending European
universities.
Though strictly not a Victorious deliverable, the video questionnaire is both a tool and a goal
towards which to work. Many intermediate stages, indicated in Table 1, have proved to be particularly
valuable in the Victorious Project. We may spell these out.
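As a sketch of how one item of such a video questionnaire might be represented in an online data-gathering system, the following illustrates one possible structure. This is an illustration only: the field names, the rendering approach and the clip filename are our assumptions, not part of any Victorious specification.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VideoQuestion:
    """One item in a video questionnaire: a prompt, its answer
    options, and an optional video clip contextualising the question."""
    prompt: str
    options: List[str]
    clip_url: Optional[str] = None  # e.g. a recorded student interview

def render_item(q: VideoQuestion) -> str:
    """Render the item as a minimal HTML fragment for an online form."""
    parts = []
    if q.clip_url:
        parts.append(f'<video controls src="{q.clip_url}"></video>')
    parts.append(f'<p>{q.prompt}</p>')
    for opt in q.options:
        parts.append(
            f'<label><input type="checkbox" name="answer" '
            f'value="{opt}"> {opt}</label>'
        )
    return "\n".join(parts)

# Question 1 of the mock-up in Figure 1:
q1 = VideoQuestion(
    prompt=("Was your experience of services in the University where you "
            "were a guest broadly similar to those described in the interview?"),
    options=["YES", "NO"],
    clip_url="clips/student_lab_experience.mp4",  # hypothetical clip name
)
print(render_item(q1))
```

The point of the sketch is simply that the video clip and the written question travel together as one questionnaire item, so the clip can be played immediately before the respondent answers.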
3. Interpreting the data
A comparison of even a small set of interviews recorded and transcribed brings out a wealth of data
which needs to undergo the stages of analysis described above before it becomes useful. What
follows (Section 3.1) is an analysis of the methods and stages adopted in one of the Universities
participating in the Victorious project followed (Section 3.2) by examples of the classification into types:
different types of interview which seem intuitively to go hand-in-hand with different types of
students and student experiences. It follows that the description in Part II of types of culture shock,
a system of measurement for student experiences of virtual mobility, emerged as a result of the close
analysis of the interviews recorded by the Victorious partners.
3.1 Methods adopted in University A
One example from among the many is thus in order. Three students were interviewed at University
A, in the UK. Two were Erasmus students from other countries, currently spending time at this
University. The third was a UK student at University A, studying in France during the 2004/5
academic year. Questions based on the set of questions in Appendix 1 were used and the interviews
were videorecorded. Two different interviews were undertaken, one where the two incoming
Erasmus students interviewed each other (i. e. Table 1, Type 1); the second where the student
returning from France was interviewed by a staff member (i. e. Table 1, Type 2). The interviews
lasted approximately 15 minutes each and were then selectively transcribed to account for all
questions in the interview and on the questionnaire. This was then analysed along with accompanying
notes.
The following key shows the coding used in this document:
Key:
Student A: Female Erasmus student from University D in Holland.
Student B: Male Erasmus student from University B1 in Italy
Student C: Male UK student, on Erasmus in University P in France last year, now back in
University A in the UK
3.2 Emerging Themes
The following summary provides an overview of the findings from these interviews:
• Library services: students did not use some online services or were not aware of them all, e.g.
electronic journals and databases. We would need to find out why; we did not ask enough.
• PC lab access: too restricted at both University A and University P in France
• Online services: None of the 3 students used all online services. Reasons? Not aware of all,
did not need to, or in case of student in University P in France, could not get access rights –
see below
• Password problems: these were cited by 2 of the students, although these were sorted out
• Availability of online materials: quite good in University A. Variable in University P in France.
• Laptops: they all own one. 2 students brought them with them, the third is going to bring his.
They found them useful, although Student C could not use his on the University P campus in France.
• Students did not generally use their home file store – can’t get much music on it! This is
probably a general trend in student access due to memory sticks/iPods and other easily
transportable media.
• Major problem of course incompatibility for Student C due to the different requirements of
University A (14 courses required), coupled with an inflexible approach to course registration
(he could only be registered on one year). The Law faculty was welcoming and used to taking
many ERASMUS students, but IT access for course materials was denied him due to his mix of
courses. He had to get first years to print out materials for him!
• Stand-alone PCs in University P in France – a very good facility for checking emails quickly.
• Online services seem more advanced in University A than in University P in France.
Table 1: Interview Type 1: Newly arrived Erasmus students at University A (Students “A” and “B”) being interviewed by a staff member of University A
Q. 1
Home/Host Universities: Home: A University D (Netherlands); B University B1 Italy. Host: University A (UK)
Q. 2
Main subjects studied: A Business, Economics, some languages (Spanish), some Law. B Physics
Q. 3
Length of stay: A 4 months. B One academic year
Q. 4/5
How many courses are you planning to take?
A 6 courses at MA level. No exams. She was only allowed to take courses from one faculty at University A
B 5 courses at undergraduate level (similar to courses at his Home University), but will only take 3 or 4 exams.
Q. 6
Are you aware of language courses? A Yes. Taking Spanish. B Was not aware of which English courses were available.
Q. 8/9
Online services and resources used at Host University:
A uses course information and lecture notes. Some lecturers’ information is available online – e.g. their availability. Did not use test because she did not need to.
B uses online course materials such as departmental handouts, course materials, some of the tests and exams.
Q. 10
Reasons for not using online materials:
A does not use some software, as the students were not encouraged to do so. She didn’t want to be the only one!
B does not use some online materials, as he does not need them at the moment.
Q. 11
Library services and resources:
A uses the library catalogue and online journals: uses many articles for her essays. Borrows few books: so no use of online library transactions. No idea about bibliographic databases.
B uses the catalogue to find books. He did not know that electronic journals, online transactions or bibliographic databases were available however.
Q. 12
Use of computer services and resources at University A:
A uses most University computer services; no use of the assisted loan for purchase of equipment, as she brought her own laptop with her. Aware of help and support, but no occasion to use it.
B uses PCs in labs, file space, email and printing. He is aware of the other services.
Q. 13
Paying for services:
A & B They have only had to pay for printing.
Q. 14
Problems using computing services:
A had a small problem with printing at first, as there were not enough PCs free. She finds it difficult that only one lab is open 24 hours and that the door security code changes weekly.
B also finds opening hours a problem. He does not have his own PC with him, so has to use labs. Only one library is open late to his knowledge, too. He has had no other problems though.
Q. 15/16 Own PCs or laptops
A brought her own laptop with her – she connects it to the University wireless system. Reliability of this is not always good. She is staying in private accommodation.
B will probably bring his laptop with him after Christmas. He has Internet access in the hall of residence, so he would be able to use it.
Q. 17
Have you had access to help and support?
A needed help at the beginning with her password.
B has not used it, has had no problems.
Q.18/19 Which digital services can you still access at your Home University, while in your Host University?
A can access databases, learning materials and the VLE, but not her file space. This has caused no significant problems, as she put all the files she needed in her email account.
B can still access Bologna library materials, email, learning materials and his student records but he cannot access personal file space. He has had no problems yet, but it is still early.
Q. 20
Things which have worked well so far in accessing and using digital services at University A:
A Likes the Mulberry email service as it is mail-based and good for group work. It is more efficient and this encourages people to use it. Uses Silkymail at home, which is Web-based and less easy. She likes the distributed PC labs. In her Home University there are 2 large labs instead.
B It’s good to have a discounted computer support and access structure. Printers are available in one support place.
Supplementary questions:
SQ1
Do you find it a problem having to use resources in English?
A No – all computer systems at Home University are in English anyway! They also have the Blackboard VLE there, just as University A has, which is easy.
B No problem with English. Home University also has Blackboard, although it does not always work well.
SQ2
Have you heard of VPN (Virtual Private Network) or e-portfolios?
A & B Not heard of VPN, but both liked the idea. But personal server space is small at Home University, so it could be difficult. Haven’t heard of e-portfolios.
Table 2: Video Interview Type 2: Ex-Erasmus Student visiting University P (France) now back in University A (UK)
Q. 1 Home/Host University: University A (UK)/University P (France)
Q. 2 Main subjects normally studied at University A: Law and French
Q. 3 Your host University: University P in France
Q. 4 How long did you stay? 10 months, i.e. an academic year
Q. 5 What did you study there? Followed courses in the Law Faculty
Q. 6 How many courses did you take? Had to do 14 (a requirement from University A) – they “ask for a lot of work”.
Q. 7 Did they give credits? Yes, they counted towards his final degree.
Q. 8/9 Language courses offered? Yes, but costly. A letter sent a couple of weeks before the course began offered French language courses. Having already studied French at University A, he felt no need. Main subjects studied at University P: Law.
Q. 10 Did any of the courses you took at your host University offer you online materials or information? Some professors offered online materials on an ad hoc basis. There was an “area” online where they did this. Provision was very varied. Others did not even have a computer. Some professors could not be contacted by email. Student C knew that course information and lecture notes were available online. But he could not access them as he was an ERASMUS student. The lecture notes were restricted to first years only and he did not qualify. He was not registered on the first years’ list but he got round this by getting some first years to print notes out and copy them for him (a good way of integrating with the first years!). He was not just doing lectures with first years, but with 2nd, 3rd and 4th years, due to the requirements from University A. He was not aware of exams or tests being online. Marks or grades were sometimes available online for the French students. Some tutors put them online, some put them on a noticeboard. There was no cohesive approach. ERASMUS students could never access grades or marks online. Student C mentioned a sort of forum on the University Website, an area for students, but he did not find it very useable. You could not really call it a VLE.
Q. 11 Why did you not use the online services? Student C could not tick some of the categories here, as they were not relevant to him at University P in France.
Q. 12 Library services: He was not sure if any were available online, as he did not use them. There were no PCs in the library. He tended to rely on help from the librarians and just used books. This was not really problematic, as so many course hours consisted of classroom contact time, and he was on campus so much.
Q. 13 Computing services and resources available: He used PCs in the computer room and had a Host University email account. Once you had got on a PC, there was unrestricted Internet access in terms of time (access to some Websites was censored by the University). Getting access to actual PCs was difficult – there was a limited number. He was not really aware of filespace. He had an email account, where he had to give another email address for forwarding. He received email into his normal Internet address. There was no instant messaging. Not aware of wireless access. There were no loan facilities for assisted purchase of PCs. The Helpdesk was quite a long way from everything else, and nowhere near the Law faculty. He was not aware of print facilities as he did not print at the University. He emailed material back to himself at home. There was a credit system for printing, but he could not work it out.
Q. 14 Problems accessing PCs: There were very few PC rooms, they had erratic opening hours and were all on the 4th floor. One very good facility was stand-alone Internet points in corridors. They were useful for quick email checking. In the Law faculty, there were not many of these stand-alone points. There were huge queues for them; a few more would have been good. It did mean that other PCs were available for longer sessions of work. C experienced a few password and login problems, due to being an ERASMUS student. It took a long time to obtain a password and this was only fixed in the Spring term. In general he felt that the Law faculty was welcoming to ERASMUS students and it took a lot of them each year. However, computing services did not seem to know how to deal with ERASMUS students. They were not unhelpful, but did not have flexible enough procedures. There was also the problem of University A requiring its students to take so many courses, spread across academic years. This did not help with IT integration. (Normally, UK ERASMUS students are only asked to do courses from one academic year.) One problem with the stand-alone points was that they could not support all forms of Internet email, e.g. Gmail and Hotmail. But they were a good idea in theory. He had no language issues, as he could speak French before he went to University P in France. The teaching was all in French.
Q. 15/16 Did you take a PC or laptop? Yes, a laptop. He used commercial access. He only used the laptop in his flat. He did take it to the University once, but could not get it to work.
Q. 17/18 Access to help or support: Once he had found the technicians at the Helpdesk, they were helpful. The problem was the details which University P had for him, e.g. where he was registered and what had been done to his email. He saw the technicians face-to-face and it was possible to contact them by email.
Q. 19 Which digital services at your host University can you still use now that you are back at your home University? He still gets emails from the Host University, mainly as he does not know how to stop them! He thinks he has access to other services, but does not know how to do so. He has not accessed materials for law there, as it is very distinct from the law studied in England. Materials and databases for French studies at the Host University would be useful, as he is doing units in French history now. He tried to contact the professor by email, but has had no reply.
Q. 21 Which services at your Home University could you still use while at the Host University? Library materials and databases: he used the catalogue to search for details of books, to see “if they existed”; then looked for these books in France. He used his Home University email via Silkymail. He used the Home University’s Law Faculty Website – he looked at resources, to decide which units to take in the year after ERASMUS. He looked at tutorial plans and lecture plans. He then registered for the following year’s courses online. The best aspect of the British system was being able to tick a few boxes and send it online – easy, compared to registering in France. He was not aware if he could use his personal Home University filestore while at the Host University.
Q. 22 Other comments: Being at the Host University made him appreciate British digital services much more. He was already used to wireless in University A, which made “such a difference”. One good thing at the Host University was the stand-alone email points, consisting of screen, ball mouse and keyboard, dotted around the campus, taking the pressure off other PCs. He used them regularly. Overall, he found digital services lacking but did not need them too much, as the emphasis was on repeating lecture content back, rather than on original research. Research was only encouraged in or after the 4th year.
Part II – Culture shock: moving from high to low or vice versa in digital services
1. Introduction
From what has been stated above it is clear that there are several transitions that students are likely to undergo when “moving”
between digital contexts in their home and host Universities. These are summarised in Table 3.
Home or host | Host or home
No or little e-learning | Lot of e-learning
No or little digital library | Well-developed and embedded digital library
Digital services & e-learning are in local language | Digital services & e-learning are in local language
Digital services & e-learning induction & training are good at all points in degree programme (just in time / as needed) | Digital services & e-learning induction & training is poor or only given at start of degree programme (drop in later problem)
Use of single VLE only required | Use of multiple VLEs required
VLE is from one ‘supplier’ | VLE is from a different ‘supplier’
Table 3: A broad summary of potential transitions in digital contexts in home and host Universities
This table is a starting point, not an end point. It posits transitions between extreme positions, effectively the end
points of a cline. There are, of course, cases where such extreme transitions are enacted, as is clear from the examples given in Part I of this
report. The majority of transitions are, however, not like this. Most consist of a large number of minor adjustments, some for the
worse, some for the better, that tend to even out: “I could print out what I wanted (positive) though I had to pay for it (negative) but
there were no waiting times (positive). The library was a comfortable place in which to work at the computer (positive) though it
wasn’t open in the evenings (negative)”. The service systems described in Table 3 are groupings of subsystems: a deficit perceived
in one subsystem is compensated by a plus value attributed to another subsystem.
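This compensatory mechanism can be sketched as a sum of signed judgements. The +1/-1 weighting per judgement below is our assumption for illustration, not a scale defined by the Victorious project; the judgements themselves come from the quoted student experience.

```python
# Each pair records one judgement from the quoted student experience,
# scored +1 (positive) or -1 (negative).
judgements = [
    ("could print out what I wanted", +1),
    ("had to pay for printing", -1),
    ("no waiting times", +1),
    ("library comfortable to work in at the computer", +1),
    ("library not open in the evenings", -1),
]

# A deficit in one subsystem is offset by a plus value in another:
net = sum(score for _, score in judgements)
print(f"net impression: {net:+d} over {len(judgements)} judgements")
```

The net score is slightly positive: the minor pluses and minuses largely even out, as the text suggests most transitions do.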
The overall picture is made more complex by the fact that the virtual mobility we are seeking to measure is part of a
more general real mobility experience. It is hard to separate the virtual and real mobility components even in an analytical
context. For example, one University may provide visiting students with free access to good online services on a par with those
offered to its own students (roughly speaking the virtual mobility component) but may limit access to these services to certain times of the
day and certain days of the week (the real mobility component). On the other hand, another University may provide a lower standard
of quality services with facilities available on a 24-hour basis 7 days a week. Which is the better scenario for mobile students? The
first case slightly penalises them over resident students who usually have better opportunities to find alternatives and are more
expert in doing so (more time, better local knowledge of resources, no language barriers etc.). The second case ostensibly
penalises mobile students less, but there is obviously a critical point (“none of the mouse pointers worked so I gave up”) where
the quality of service is such that mobile students become less competitive (“it slowed me down”) as compared with resident
students who, inter alia, are usually more successful and effective in making complaints about service quality.
These scenarios are not hypothetical. The first scenario typically occurs where access to online services is through libraries, the
second where Universities (often in decentralised structures such as halls of residence) have invested in low-cost self-access
computer rooms with no assistance and little provision for maintenance of equipment and software.
Whichever way we want to look at transitions between different service cultures and conceptions of virtual mobility,
research into virtual mobility has to be carried out in terms of classificatory systems that take compensatory mechanisms fully
into account. Such classifications may well only be a rough guide insofar as students do not share the same perception or
judgement about the significance of specific aspects of digital services but have the advantage of keeping certain trends in proper
focus. For example, students’ judgements are usually “coloured” in favour of the Host University as opposed to the Home
University, with freedom-at-last considerations clearly predominating over I-would-have-been-better-off-staying-at-home judgements about
their overall experiences, digital services included.
All of this raises the following questions: How do we go about describing these transitions and, more importantly,
how do we link them to students’ satisfaction levels? How do we identify and rate Universities’ efforts to build quality control
systems into their service offering that are designed to reduce the shock of transitions?
As in Part I, in answer to these questions this section is divided into a number of steps. These are:
Step 1: Classifying cultural shock
Step 2: Identifying and researching typical “shocks” (Scale 1)
Step 3: “Shock absorbers”: taking Universities’ special correctives into account (Scale 2)
Step 4: Testing the Scales through their application to sample data
A fifth step (Step 5: further research) is described in the concluding section.
Step 1: Classifying cultural shock
In the examples given above, we have identified various transitions that students may and do experience in digital contexts in their
home and host Universities. In this respect, we can establish the following table.
A SERVICE cline:
  NO/POOR SERVICES | HALF-WAY HOUSE / SERVICES IN-BETWEEN | TOP QUALITY SERVICES
and a related CULTURAL SHOCK cline:
  VERY SHOCKED AND DISAPPOINTED | NO CHANGE | VERY SHOCKED BUT PLEASED
We can illustrate this in more detail as follows.
BAD HOME
  Learning: no or little e-learning
  Library: no or little digital library
  Language: digital services & e-learning are only in the local language, e.g. Ancient Tibetan
  Start-ups and refreshers: no “how-to” start-ups, i.e. digital services & e-learning induction & training is non-existent or poor and, where provided, is only given at the start of the degree programme (a problem for students who drop in later)
  Virtual learning environments: use of multiple VLEs required; VLE is from one ‘supplier’

IN THE MIDDLE
  Learning: evolving system, some e-learning but patchy
  Library: evolving system
  Language: only some parts of the services are in English, others are in the local language (for example the library is in English but the e-learning courses are not, or vice versa)
  Start-ups and refreshers: some effort is put in but it is not sustained at various points
  Ease of access to e-services: the University mainly uses just one … but has some services that require a different VLE

GOOD HOST
  Learning: lots of e-learning; e-learning galore
  Library: well-developed and embedded digital library
  Language: digital services & e-learning are in plain ENGLISH (i.e. a form of English that is accessible to the majority of adult Europeans attending Universities)
  Start-ups and refreshers: digital services & e-learning induction & training are good at all points in the degree programme (just in time / as needed)
  Virtual learning environments: use of a single VLE only required; VLE is from a different ‘supplier’
It was, however, important to have a scale that measured smaller movements. Table 4 posits different levels of student
satisfaction on a more articulated scale that helps identify one of the main findings of the project, namely that, with some
exceptions, there are few movements from one extreme to the other.
We posit five student satisfaction levels:

1  Very satisfied     Student A
2  Satisfied          Student B
3  No changes         Student C
4  Dissatisfied       Student D
5  Very dissatisfied  Student E

Table 4: Students’ satisfaction levels
STEP 2: STUDENT TRAJECTORIES ALONG THE CLINE (UP or DOWN)
We can tabulate students’ possible movements along these clines in the following ways:
We can describe students’ movements up and down the service cline and the cultural shock cline in terms of a system of coloured
rosettes.
There is a special case, dynamic in terms of the student’s trajectory
(from one university’s service offering to another’s) but
static in terms of culture shock.
There are 3 subcategories in this 5th type:
5a) “from the fire into the fire”
5b) “from the frying pan into the frying pan”
5c) “from paradise to paradise”.
We can summarise this as follows in terms of student types:
Students A, B, D, E: the student’s position changes both in terms of the service cline and the satisfaction cline.
Student C: the student’s position changes in terms of the service cline but not the satisfaction cline; the student follows a trajectory but there is no change in his or her level of satisfaction.
Student C subcategories a, b, c: this pilot makes little reference to the subcategories of this special case because the attempt is to measure culture shock. We will thus deal with 5 categories, using the 5th as the “control” case for comparative purposes.
STEP 3: ACCOUNTING FOR SPECIAL CULTURE “SHOCK ABSORBERS”
Increasingly, Universities around the world target special communities, such as religious,
ethnic, gender and other minority groups including the disabled and underprivileged, to
provide special services directed to the members of such communities. This includes special
services for Erasmus students designed to reduce cultural shocks and to encourage
integration into the wider population. How do Universities build in systems that reduce the
shock of transition by explaining what experiences students can expect? Do they use
websites, SMS, traditional or digital solutions to provide the required interactions?
All this presupposes that we need to look for information about what our Universities (at top or local levels) do to provide
specific support for Erasmus and other visiting students, including for example:
Specific induction programmes;
Assignments to a ‘personal advisor’ on staff who specialises in visiting students;
Peer assignments in a buddy system i.e. assignments to a mentor (local student) who provides orientation and support at
the start or throughout the visit;
Notification (active or passive) to course leaders about the presence of visiting students on their courses and countries of
origin.
This raises the need to devise a second scale through which to identify, classify and rate information on what Universities do
to provide specific support for Erasmus and other visiting students. In terms of satisfaction levels, the key point and common
denominator is whether students are treated as individuals and/or members of specific and/or general communities. Accordingly
we have established 5 target levels for services of these kinds.
TARGET LEVEL
SERVICE LEVELS in e-services based on good interpersonal relations
TARGET LEVEL A
INTERACTIONS WITH INDIVIDUAL STUDENTS AS MEMBERS OF SPECIFIC COMMUNITIES e.g. SMS services to tell an individual
member of the Erasmus community that an appointment with an Erasmus co-ordinator is confirmed
TARGET LEVEL B
INTERACTIONS WITH INDIVIDUAL STUDENTS AS MEMBERS OF THE GENERAL STUDENT POPULATION e.g. SMS services to notify
an individual student that a library book has been fetched
TARGET LEVEL C
INTERACTIONS WITH ALL MEMBERS OF SPECIFIC COMMUNITIES e.g. provision of SMS services in English or the student’s own
language that there is a room change for that afternoon’s language class
TARGET LEVEL D
INTERACTIONS WITH ALL MEMBERS OF THE GENERAL STUDENT POPULATION
e.g. provision of SMS services to remind students that elections of student representatives are taking place
TARGET LEVEL E
CONTROL CATEGORY: NO TARGETING BECAUSE THE SERVICE DOES NOT EXIST; this is essentially a “black mark”
STEP 4a: DATA ANALYSIS
It is now time to see how these scales can be put into practice. We can revisit the interview described in Part I. The interviewer
realises that the student considers herself as having moved from a University offering middle-of-the-road services to one with
top-quality services. A clear strategy to target Erasmus students through special web pages emerges later.

STUDENT: “easy there you know… you have lots of material and a lot of time [in the laboratory]. Everything you need to work”
INTERVIEWER: “so you found better services there… much easier to study”
Rating: 2 (Satisfied) on the satisfaction scale; Target Level B on the good practice scale.

Example 1: Application of the classification systems to contextualised data: e-learning in general

On-line library services were judged slightly poorer (books had to be ordered in loco with a written procedure as no online
reservation system was available) but the targeting of individual students was better (an SMS message was sent to all students
when a book had actually been fetched), so that there was no OVERALL culture shock because the two differing levels
compensated each other: 4 (Dissatisfied) + Target Level B = 3 (No changes).

Example 2: Library services: how the two scales interact
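The compensation at work in these two examples can be sketched in code. This is purely an illustrative encoding, not part of the project’s formal apparatus: the function name, the dictionary of “absorber” strengths and the one-step correction rule are all assumptions; only the worked figures (a rating of 4 corrected to 3 by a Target Level B service) come from the example above.

```python
# Illustrative sketch: combining the satisfaction scale (Scale 1) with
# the "shock absorber" target levels (Scale 2).
# Scale 1: 1 = very satisfied ... 3 = no changes ... 5 = very dissatisfied.
# Target Levels A-D are assumed to pull the score one step back towards 3
# (no changes); Level E (service does not exist) provides no absorption.

ABSORBER_STRENGTH = {"A": 1, "B": 1, "C": 1, "D": 1, "E": 0}  # assumed values

def overall_satisfaction(scale1_score: int, target_level: str) -> int:
    """Move the Scale 1 score towards 3 by the absorber strength."""
    correction = ABSORBER_STRENGTH[target_level]
    if scale1_score > 3:
        return max(3, scale1_score - correction)
    if scale1_score < 3:
        return min(3, scale1_score + correction)
    return 3

# The library example: 4 (dissatisfied) + Target Level B = 3 (no changes)
print(overall_satisfaction(4, "B"))  # prints 3
```

The symmetry of the correction (moving towards 3 from either direction) reflects the idea that absorbers dampen shocks whether they are positive or negative.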
Step 4b: A field application
Exploring the culture shock of visiting students at a Host University (in the Netherlands)
Between 15 November 2006 and 5 December 2006 an experimental survey was carried out on the culture shock of visiting
students at a Host University (University E). Sixty-odd students who visited a Host University in the Netherlands in the academic
year 2006/2007 answered the following questions:
1. Which country do you come from?
2. Please tell us about your experiences here at the Host University in comparison with your experiences at your home University in the following fields:
   a. Use of e-learning in courses (no, little, some, intensive, don’t know)
   b. Provision of a digital library (no, very limited, fair, extensive, very extensive, don’t know)
   c. Use of Virtual Learning Environments (not available, one VLE in use, several in use, don’t know)
   d. Language of computer tools and systems (local language, local + English, several, don’t know)
   e. Availability of communication tools (no, very limited, fair choice, wide range, don’t know)
   f. Provision of training and induction on computer tools and systems (no, little, some, sufficient, very good, don’t know)
   g. Provision of support and help on computer tools and systems (no, little, some, sufficient, very good, don’t know)
3. Did you experience ‘big transitions’, a sort of culture shock, in the above fields when coming to your Host University?
4. Please give us concrete examples of “big transitions” you had to cope with:
5. How does it feel to make these transitions? Do you feel lost? Can you cope with them?
6. Is help available to cope with the transitions?
7. Please tell us how you managed to cope with the transitions:
8. What features help to make the transitions, and what don’t?
9. What makes, in your opinion, an Erasmus-friendly (online) course?
10. Do you try to use your own University’s system if you can?
As may be appreciated from Table X, this survey marks a partial return to traditional techniques in data gathering: namely the use
of a questionnaire followed by statistical analysis of the replies. However, the questionnaires are influenced by the type of
concerns we have described above and are therefore concerned with attitudes. Only the first two questions are of a closed type.
Questions 3-10 are either open-ended questions (“What makes, in your opinion, an Erasmus-friendly (online) course?”) or open-ended
instructions (“Please tell us how you managed to cope with the transitions”) for which traditional closed-set multiple choice and yes/no
answers do not apply. However, where possible students were asked to give their responses in terms of key words and short
notes so as to make it possible to analyse the data quickly. This inevitably leads to a lower resolution in terms of detail – the
microscope is definitely turned down – which is compensated for, however, by the fact that some overall trends can be picked
out which (see the conclusions below) require further research that goes beyond the exploratory scope of this Victorious Pilot.
There are thus few shortcuts when gathering data of this type and more questions are raised than can be answered.
We may comment on this field application by systematically exploring the results in relation to a series of tables that
interpret the results of each question.
Question 1: Firstly, we may analyse the cohort of respondents (62) from 19 different countries in terms of (i)
geographical and (ii) linguistic/cultural remoteness from the Host University. Geographical remoteness may be tabulated in terms
of distributions shown in the table on the right. These distributions are rough-and-ready rule-of-thumb characterisations drawn
up in relation to approximate geographical position vis-à-vis the Host University in the Netherlands. A slight predominance in
provenance is apparent from southerly and easterly countries, an asymmetry due mostly to the already westerly position of the
Netherlands and the low incidence of students from the British Isles (only 1 student from Ireland). In other respects, the cohort
shows considerable symmetry and representativeness. The cohort is particularly rich as regards language groups: 21 students are
from areas which are traditionally Romance-speaking (France, Italy, Portugal, Spain, Romania), 17 from traditionally Germanic-speaking
areas (English, Icelandic, Danish, German, Dutch) and 24 from four other language groups: (a) Finland, Hungary; (b)
Poland, Lithuania, Slovakia, Slovenia, Bulgaria; (c) Turkey; (d) Greece.
These distributions would clearly lead us to anticipate that the biggest negative shocks will come from students from
countries, cultures and languages that are remotest from the Host University (e.g. Greece, Turkey). We will see below that this is
emphatically not the case.
Table relating to Question 1

Country        Number of respondents
Bulgaria       1
Denmark        1
Greece         1
Hungary        1
Ireland        1
Portugal       1
Romania        1
Slovakia       1
Turkey         1
Finland        2
Iceland        2
Lithuania      2
Netherlands    3
France         4
Slovenia       4
Italy          7
Spain          8
Germany        10
Poland         11
Total          62

Geographical distribution of respondents relative to the Host University:

           Extreme West             Westerly         Easterly                 Extreme East                                          Total
Northern   Iceland (2)              Denmark (1)      Finland (2)              Poland (11), Lithuania (2)                            18
Central    Ireland (1)              Netherlands (3)  Germany (10)             Slovakia (1), Hungary (1), Romania (1), Bulgaria (1)  18
Southern   Spain (8), Portugal (1)  France (4)       Italy (7), Slovenia (4)  Greece (1), Turkey (1)                                26
Total      12                       8                23                       19                                                    62
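The row and column totals of the cross-tabulation can be recomputed from the per-country counts as a simple arithmetical cross-check. The snippet below is only a verification sketch; the region assignments merely reproduce the table, they are not an independent classification.

```python
# Recomputing the Question 1 cross-tabulation from per-country counts.
from collections import defaultdict

respondents = {
    "Bulgaria": 1, "Denmark": 1, "Greece": 1, "Hungary": 1, "Ireland": 1,
    "Portugal": 1, "Romania": 1, "Slovakia": 1, "Turkey": 1, "Finland": 2,
    "Iceland": 2, "Lithuania": 2, "Netherlands": 3, "France": 4,
    "Slovenia": 4, "Italy": 7, "Spain": 8, "Germany": 10, "Poland": 11,
}
region = {  # (latitude band, longitude band) relative to the Netherlands
    "Iceland": ("Northern", "Extreme West"), "Denmark": ("Northern", "Westerly"),
    "Finland": ("Northern", "Easterly"), "Poland": ("Northern", "Extreme East"),
    "Lithuania": ("Northern", "Extreme East"), "Ireland": ("Central", "Extreme West"),
    "Netherlands": ("Central", "Westerly"), "Germany": ("Central", "Easterly"),
    "Slovakia": ("Central", "Extreme East"), "Hungary": ("Central", "Extreme East"),
    "Romania": ("Central", "Extreme East"), "Bulgaria": ("Central", "Extreme East"),
    "Spain": ("Southern", "Extreme West"), "Portugal": ("Southern", "Extreme West"),
    "France": ("Southern", "Westerly"), "Italy": ("Southern", "Easterly"),
    "Slovenia": ("Southern", "Easterly"), "Greece": ("Southern", "Extreme East"),
    "Turkey": ("Southern", "Extreme East"),
}
rows, cols = defaultdict(int), defaultdict(int)
for country, n in respondents.items():
    lat, lon = region[country]
    rows[lat] += n
    cols[lon] += n

print(rows["Northern"], rows["Central"], rows["Southern"])  # prints 18 18 26
print(cols["Extreme West"], cols["Westerly"],
      cols["Easterly"], cols["Extreme East"])  # prints 12 8 23 19
```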
Question 2: This Table may be interpreted as evidence of the claim made in the Introduction to Part II of this report
that on the whole students tend to view the Host University in a more favourable light vis-à-vis their Home University. More
research is required for example in the various geographical areas of the European Union to assess whether or not this bias is a
constant but it is clearly a factor that cannot be overlooked. On the whole, the findings tend to confirm that the transitions are
rarely massive and more likely to consist of one move up or down Scale 1 described in Step 2 above, and likely to be mitigated by
compensatory factors. The most striking exception is in relation to language where the Host University scored highly (91% in
favour and only 5% against) as compared with the Home University in its provision (51% in favour and 49% against).
Question 2: Experiences at the Host University in comparison with experiences at the Home University in the
following fields

                                         Home University   Host University
a) Use of e-learning in courses
   no / little                           57%               37%
   some / intensive                      40%               56%
b) Provision of a digital library
   no / very limited                     36%               20%
   fair / extensive / very extensive     59%               61%
c) Use of Virtual Learning Environments
   not available                         31%               10%
   one / several in use                  44%               61%
d) Language of computer tools and systems
   local language                        49%               5%
   local & English or several            51%               92%
e) Availability of communication tools
   no / very limited                     33%               12%
   fair choice / wide range              64%               79%
f) Provision of training and induction on ICT
   no / little / some                    57%               48%
   sufficient / very good                35%               39%
g) Provision of support and help on ICT
   no / little / some                    56%               36%
   sufficient / very good                34%               46%
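The “most striking exception” noted above can be read off the table by computing the Host-minus-Home swing in the positive response categories. The snippet below is a small illustrative calculation; the field labels are abbreviated forms of the table’s row headings.

```python
# Percentage-point swing (Host minus Home) in the positive categories
# of the Question 2 table.
favour = {  # field: (Home %, Host %)
    "e-learning": (40, 56),
    "digital library": (59, 61),
    "VLEs": (44, 61),
    "language": (51, 92),
    "communication tools": (64, 79),
    "training & induction": (35, 39),
    "support & help": (34, 46),
}
swing = {field: host - home for field, (home, host) in favour.items()}
for field, points in sorted(swing.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{field}: {points:+d} points")
# Language tops the list with a +41-point swing; no other field exceeds +17.
```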
Question 3: 85% of the students said that their Home University is broadly comparable to the Host University or that they
only experienced minor differences. 11% were a little “culture shocked”; only two experienced serious transitions.
One came from the Netherlands itself, with a strong adverse reaction; the other came from France and was positively
“shocked”: he enjoyed the transitions. When we look at the data overall, the distribution into areas patterns out in the
way indicated in the following table.
Northern:  Extreme West: no culture shock; Westerly: no culture shock; Easterly: slight negative shock; Extreme East: just a teeny twinge
Central:   Extreme West: minor tremors, both positive and negative; Westerly: from just round the corner: still quivering with 100% negative culture shock; Easterly: no culture shock; Extreme East: minor tremors, both positive and negative
Southern:  Extreme West: just a teeny positive twinge; Westerly: slight to strong positive shock; Easterly: just a teeny positive twinge; Extreme East: no culture shock
One reading of these results might be to see the strongest shocks (whether positive or negative) in terms of a series of concentric
circles that suggest that the farther away a University is both geographically and culturally and linguistically the less students are
inclined to be shocked precisely because they are well prepared for this and are attracted by the prestige associated with going to a
remote University. It is, on the other hand, more local students both geographically and culturally and linguistically who seem to
be less prepared because they had not expected to encounter any differences. Such a finding needs to be treated with caution
since the numbers are not very large. It does, however, show just how important it is to focus on intangible aspects of virtual
mobility which include prestige, independence and self-affirmation.
Question 4: If the answers to Question 3 suggest the significance of language services in creating or removing barriers
to virtual mobility further evidence of the need for more research into the impact of language on transitions between Universities
is provided by students’ responses to Question 4, which asked students to list examples of big transitions. Language problems
were the first on the list. This does not mean that the Host University in question was in any way failing in this respect. It does
suggest, however, that European Universities’ response to language problems in virtual mobility is uneven with many examples of
under-investment. The unevenness can be quickly gathered for example by comparing those European Universities whose
websites are entirely translated into English and those where only a few pages are in English.
Examples of big transitions:
• Language problems
• Availability of course information, materials, exam schedule
• Use of VLE in contrast to use of paper material
• Organization and size of classes, organization of exams, teaching approach, role of library
• Role of the professors, communication with professors, behavior of professors
• Bureaucracy
• Mentality of local people
Questions 5 to 10 might be rephrased as a single question: How quickly did you settle in and what helped most? It is striking that
students had different perceptions of the help provided in coping with transitions but equally striking that the vast majority did
not experience any major difficulties, in keeping with the suggestion made above that both the University in question and the
students themselves were well prepared to cope with the transitions involved.
Question 5: 95% of the students had no problems, or had problems only at the beginning of the year and now cope well with
them. 2% felt rather lost.
Question 6: 36% of the students had enough help to cope with the transitions, 35% had some help, and 29% had no help
available.
Question 7: How to cope with the transitions:
• Ask local students, other Erasmus students, buddy, coordinator staff or teachers
• Participate in introductory sessions
• Use of internet or VLE
• Learn local language and English
• Adapt to local habits
• Patience, open-mindedness, living together with local students, participating in sports, not focusing too much on studies
Question 8: What features help to cope with transitions:
• Erasmus office
• Orientation days
• Introduction to VLE in the beginning of the year
• Information in English available online in advance
• List of FAQs
• Buddy system
Question 9: What makes an Erasmus friendly course:
• Use of English as teaching language
• Course information and course material online in time
• Personal support by teacher or staff
• A clear statement about ECTS, program and evaluation method.
• Online courses, interactive courses
• Good courses in respect of contents and teaching method
• Interaction, exchange of opinions with students of another nationality, religion, ethnic background, age…
Question 10: 70% of the students did not use their own University’s system. 30% did use it for
• e-mail
• e-journals, online library
• online dictionary, encyclopedia
• VLE, access to course material, enrollment for courses and exams
• online language learning
Conclusions
What is the next step? Pilot 2 has been concerned with the development of tools that provide answers to the questions we raised
above, namely: What experiences of virtual mobility do European students have and how do they view those experiences? Are
they positive or negative? Or are they mixed and subject to variation over a period of time? How do Universities take into
account and plan for the transitions from one service culture to another? How can transitions, associated culture shocks and
attempts by Universities to help students take such transitions in their stride be accurately measured?
In response to these questions Pilot 2 has variously proposed, contributed to or experimented with the following:
• a satisfaction scale: that identifies and rates levels of satisfaction in terms of individual experience of e-services when
changing Universities
• a good practice scale: that identifies and rates efforts to introduce special add-on services for students, as individuals and/or
members of specific communities, in support of e-mobility
• data-gathering methods: that provide pilot data
• validation methods: that test the validity of the scales on pilot data
• application methods: that make it possible to apply the scales to larger data sets
• a scale-based checklist: that allows individual European Universities to measure their performance against independent
standards rather than by direct comparison with specific competitors (see Final Report)
Further development is dependent on larger-scale application to answer questions relating to typical distributions, a matter that
by definition goes beyond the very nature of a pilot.
Acknowledgements
(i) Students who participated in the interviews are thanked for allowing their experiences to be shared. None of the students and
none of the Universities are identified other than in a general, abstract way based on numbers and letters for which no key is
provided. The students interviewed agreed, however, to allow their photos to be used on condition of anonymity.
(ii) This report was written by Anthony Baldry who acknowledges the help given by the other members of the Pavia team:
Caterina Guardamagna; Annalisa Golfredi; Nicola Martinelli; Stefania Nicola; Ivana Marenzi; Federica Panati; Alessandra Varasi.
(iii) The report is based on materials supplied by other members of the Victorious team. Special thanks in this respect go to the
following: Bristol: Sue Timmis and Angela Joyce; Edinburgh: Jeff & Denise Haywood; Leuven: Nicky Rose; Siena: Cesare Zanca;
Tartu: Aune Valk.