
Over the past decade, university internationalisation has been described, debated and occasionally denounced, with some scholars proclaiming its inevitability and others its imminent demise.
Yet while scholars agree on little about what internationalisation can or should produce at universities around the world, most universities now share the ambition of pursuing internationalisation at a strategic, comprehensive level.
From mission statements to business partnerships, institutions have expanded far beyond the traditional approach of physically sending and receiving students to and from other countries.
To that end, new positions are continually being created and refined as universities develop what is commonly known as the senior international officer, or SIO, role.
These individuals and their evolving roles should pique our interest: Who are they? How do they advocate for and implement internationalisation? What will internationalisation become under their leadership? How they interpret sometimes vague presidential mandates, vet and negotiate a growing number of transnational opportunities and coordinate international efforts across the campus community will have a long-term impact on institutions in the 21st century.
Top university administrators are almost always advocates of internationalisation, viewing global activity as critical to future success.
Accordingly, many institutions have created senior administrative positions tasked with making internationalisation happen.
At most institutions the senior international officer is a fairly recent invention, appearing in significant numbers only in recent decades.
Their titles vary – some are directors, others deans, assistant or associate vice-presidents or provosts – but the thrust of the position is the same: to lead global initiatives and ensure the institution fulfils its strategic missions related to internationalisation.
To build a clear picture of senior international officers and their roles, the Association of International Education Administrators (AIEA) has surveyed its membership regularly over the past decade.
The latest edition of the survey, compiled by Darla K Deardorff, AIEA executive director, together with CK Kui, reveals a professional field marked by both similarity and diversity in roles related to internationalisation.
One of the more notable findings, given the increasing resources universities are devoting to these roles, is that 50% of respondents have held their SIO post for fewer than five years.
Yet those filling this growing number of positions are no strangers to administration: roughly 60% of SIOs are between 51 and 65 years old.
These are seasoned academics – half hold tenured faculty positions – making them academic administrators with a deep understanding of their institutions' histories and futures.
Unlike other emerging fields such as marketing and social media, where leadership skews younger, the SIO position remains largely the province of established scholars, although the average age of SIOs does show signs of slowly declining.
According to the AIEA survey, more than half of SIOs have seen new positions appear in their field, and the prestige of these positions has risen with them.
Reflecting this trend, the titles SIOs hold have changed significantly since 2011.
While 'director' remains the most commonly reported title (25% of respondents), the number of SIOs titled associate or assistant vice-president/chancellor/provost has grown (24%), followed by vice-president/chancellor/provost (18%).
Implementing internationalisation
The continued evolution of SIO positions is a response to the changing nature of internationalisation itself.
Traditionally, internationalisation meant the mobility of people: sending students and faculty abroad to study, including service learning and academic exchanges.
But the AIEA report suggests that, while still plentiful, these activities have become secondary for SIOs.
Instead, their top three responsibilities involve liaising on partnerships, representing the institution in its dealings with other institutions, and strategic planning for internationalisation.
For many SIOs, these changes have made them shapers of internationalisation rather than merely its practitioners.
As part of this, they are ever more tightly bound to the central operations of the university.
No longer are they remote offices supporting student exchanges.
Instead, they sit within the central administration, although at some smaller institutions persistent problems can arise from fluctuations in funding and interest, which may accompany changes in institutional leadership.
Internationalisation is a central mission of higher education institutions, and SIOs play a prominent role in implementing the corresponding components of institutional strategic plans.
They are responsible for making internationalisation happen, and increasingly they do so from a seat on the campus leadership team.
In the decades ahead they will hold significant power to lead and direct the internationalisation process.
In the hands of the senior academics who serve as SIOs, we can be confident that the internationalisation initiatives they pursue will take the institution abroad and bring the world back home, to the benefit of the campus community.
The authors are affiliated with the Abu Dhabi School of Management and – in the case of Grace Karram Stephenson – with the Centre for Comparative, International and Development Education at the Ontario Institute for Studies in Education, University of Toronto, Canada, as specialists in higher education and international education.
These SIO positions (not all of them, of course), in my experience – and the survey findings cited in this article bear this out – are given to seasoned administrators drawn from other areas of university management, or in some cases to people from fields entirely removed from higher education, such as banking, marketing or corporate management.
Such individuals rarely have much sense of what internationalisation (or globalisation) means, let alone of how to internationalise an entire university by turning its lofty visions into solid strategic plans and programmes.
In many cases they speak hardly any foreign language and have no experience of dealing with foreign cultures, or even with international students.
Expecting them to strategically plan and grow a university's international outreach seems a mission impossible, at a time when no university can survive in the long run without such outreach.
As long as universities do not break the 'promote from within' pattern in the globalisation of higher education, the 'growing role' of the SIO in this article's title will amount to little more than a 'role' – one assigned to follow the trend rather than to pursue genuine internationalisation. – George T Sipos, on the University World News Facebook page
Universities determined to rise up international rankings are increasingly ‘playing’ the methodology, Shaun Curtis of the University of Exeter in the UK told the “Worldviews 2013” conference last week.
One way is to seek support from colleagues in other institutions who are answering rankings questionnaires, and another is to game the data.
Some universities, said Curtis, who is director of ‘International Exeter’, were encouraging people to support their institutions in reputation surveys.
Recently he received an email from a colleague at a partner university reminding him that a rankings questionnaire was on the horizon.
“The colleague listed the university’s achievements in recent years and the trajectory it had travelled – and a quite useful link to the questionnaire was given as well.
There was no direct approach, but you could see what was happening.”
It was also possible to play the data.
“I was amazed to see an advert from an Australian university that was looking to employ rankings managers on incredibly high salaries.
And why did they want to do that? Basically, you can play the rankings game.
“Perhaps a university can rise up the rankings because they have world-class data crunchers.”
Curtis was a panellist in a session on the relevance and rise of rankings, along with Bob Morse, director of data research at US News and World Report, Mary Dwyer, senior editor at Maclean’s magazine in Canada – they both produce national university and college rankings – and Phil Baty, editor of the Times Higher Education world rankings.
Curtis said that 5,000 of Exeter University’s 18,000 students were from outside the UK.
“Rankings therefore play a very important role in what we do.”
Rankings were explicit in the university’s strategy.
While rankings had flaws, that was no excuse for poor performance.
National rankings, Curtis contended, were more influential than global rankings.
“Context is key.”
For students, parents and recruitment agencies, it was more important to understand how Exeter was doing in relation to other UK universities than in relation to institutions in other countries that people did not know.
Domestic rankings were more data driven, international rankings more perception driven.
However, international rankings had an important effect on prestige and so universities had to pay attention or risk being caught unawares.
British universities were starting to play the rankings game, sometimes quite blatantly, with attempts to exploit some of the methodologies.
“And that’s especially true for the international rankings, which are more perception driven.”
Curtis was also concerned about rankings affecting policy, with governments apparently wanting to concentrate funding in rankings winners.
“The media is influencing this policy debate.”
Obscenely powerful
Phil Baty of Times Higher Education, or THE, said, in a disembodied voice over an audio link from London, that rankings had become “obscenely powerful”.
Brazil was sending 100,000 students to study abroad only at ranked institutions, Russia was giving special recognition to degrees from top ranked universities, and India was only allowing in institutions that were globally ranked.
Another “rising power” – Twitter – was using the international rankings to decide where it would set up a research centre.
A measure of the importance of rankings was research showing that they were the number one factor for students making university choices – more important than fees and, remarkably, course content.
“This reflects the huge investments made.”
If a student was spending six figures on a qualification, it was a bigger investment decision than buying a car or even a house.
“It’s about a brand, a lifelong career.”
While all university rankings had serious limitations and their power was not justified, they nevertheless had a very important role to play – but only if they were transparent and honest about inherent weaknesses.
Baty argued that global rankings were “more responsible” than national ones.
The thrust of his argument was that international rankings only tried to compare large, research-intensive universities around the world.
“We are only interested in selecting the global elite.”
Therefore, international rankings avoided the pitfall of national rankings, which compared big and small institutions, putting diverse groups of institutions in the same hierarchical list, “condemning outstanding local institutions as failures or somehow inferior”.
The same fate befell THE’s competitors, which ranked large numbers of universities – 700 to 800 – rather than the 200 ranked by THE.
This did sound a little as if THE was making a strength out of a weakness.
The national rankers
However, US News and Maclean’s said they had quickly realised that in a sector with diverse institutions it was not useful to measure all institutions against one another.
So both introduced categories for different types of institutions, in which it was possible to do ‘apples for apples’ comparisons.
Mary Dwyer of Maclean’s said three categories had been created for different types and sizes of institutions in Canada.
“What has changed is the amount of info available.
Now every university has its own website and there is a wealth of data online from other organisations.”
But while there was a lot more information available, it was “clear that students and parents are still looking to the media and rankings” – although rankings tended to be a starting point for people in choosing where to study.
The media played a crucial role, said Bob Morse of US News.
In the United States, the government would not undertake rankings, although research councils did comparisons, and “higher education would never rank itself”.
The media in America was seen as credible – this might not be the case in other countries.
There were dangers when rankings were connected to governments.
They could be seen as a direct tool to decide policies through the data.
“That’s a different kind of process than rankings whose purpose is to serve consumers.”
Dwyer said that universities and governments tended to hold off from rankings because there were lots of different interests jostling to be served.
Curtis agreed, voicing concern over the new international U-Multirank exercise being underwritten by a supranational government, the European Union, rather than being produced by the media.
Countering gaming
Regarding universities gaming the rankings, Baty said rankings needed to be held up to scrutiny.
THE’s reputation survey was not open; it was distributed in various languages, and it made sure that people in enough countries were asked to join surveys.
“So we are working hard to iron out some of these biases.”
Dwyer said that Maclean’s got all its data from third-party sources – for example, research councils.
When the reputation survey was sent to universities, they all got the same number.
“With those types of controls, there is not enough data that universities can affect that much.”
Morse argued that rankings organisations should not believe their data were perfect.
“This is not the position.
We must be realistic in saying that it will be a battle to get the correct data and build a culture of data standards, because when the stakes are high people will make the effort.”
What everybody agreed on was that rankings were not going away.