Expo ’98 celebrates 20 years in 2018, having been inaugurated on May 22, 1998.
The history of CCG is linked to Expo ’98, since Centro de Computação Gráfica was responsible for the Virtual Oceanarium project and for the development of the first version of the Expo ’98 website.
Integrated into the INI-GraphicsNet international network, CCG carried out the Virtual Oceanarium project (1996 – 1998), which recreated the environment of the Lisbon Oceanarium, at the time the largest in Europe, with the help of Virtual Reality technologies, allowing users to navigate it virtually.
The Virtual Oceanarium was presented at the Oceanophilia Pavilion for six weeks before moving in October to the Virtual Reality Pavilion, inaugurating the new “Parque das Nações”.
It was also the cover of the prestigious scientific journal “Communications of the ACM” in 2000, with an article dedicated to the topic.
Source of illustrations: COMMUNICATIONS OF THE ACM July 2000/Vol. 43, No. 7
The research and development work of this project was carried out by a team that united Portugal (CCG) and Germany (IGD).
The main component of this work was a real-time simulation system that simulated artificial animal species, optimized to allow smooth simulation, animation, and navigation. It was also implemented in a modular way, which allowed easy integration of new species and content.
CCG was also responsible for developing the first website of the 1998 World Exposition, work that began four to five years before Expo ’98 opened.
Having been founded in 1993, CCG thus began developing the Expo ’98 website early on, when Expo ’98 did not yet have a domain or a server.
Expo ’98 homepage, 1994 website, in Topics Magazine, 1995
The “Papers@CCG” event will take place on June 13 at CCG – Centro de Computação Gráfica, Azurém Campus nº14, Universidade do Minho, in Guimarães.
This workshop will present and discuss the most recent articles published in, or accepted by, specialist conferences and journals, resulting from the work of the different CCG teams.
The goal of the event is to present the scientific activity of this institution.
This event is aimed at the entire CCG community, Ph.D. students, and the wider University of Minho community.
Entrance is free and limited to the auditorium capacity.
Registration for this workshop can be done online: register now.
The event starts at 2:00 p.m. and is scheduled to end at 6:00 p.m.
In this workshop, which is the second “Papers@CCG” event organized by the CCG, the following topics will be addressed:
The authors will present the following scientific publications:
The scientific publications of the different domains of applied research of this institution can be viewed online on the CCG website.
The CCG (Centre for Computer Graphics) held a Think Tank session dedicated to the theme “Computer Graphics and Computer Vision”, on May 23, at its facilities, at the Azurém Campus of the University of Minho, Guimarães.
This session brought together about 30 representatives from a wide range of companies and institutions to discuss technology applied to three sectors: Health, Industry, and Real Estate.
It fell to CVIG (Computer Vision Interaction and Graphics), the CCG applied research domain intrinsically linked to computer graphics, to highlight the technological potential and opportunities associated with each of these sectors.
After the presentation of CCG/CVIG projects and work to the participants, three discussion groups (Health, Industry, and Construction) were formed, each in a different space, to debate the many technological issues related to these areas.
At the health table, quantitative medical imaging was discussed, namely support for the development of automatic tissue and organ segmentation systems; the development of methods for detecting and diagnosing pathological lesions; and the massive exploration of feature spaces to select the feature set with the best performance for a given pathology.
In the textile industry, the advantages of automatic optical inspection at different phases of the production process were discussed, as well as the advantages of solutions for detecting, recognizing, and tracking objects and people.
Regarding construction and real estate, the focus was on remote and local collaboration with mixed reality; on systems for visualizing project information and real estate in its final target space; on the creation of virtual environments for simulation and prototyping; and on systems for customizing spaces and products.
At the end of the session, the specific conclusions of the different groups were presented.
In health, the distance between R&D and users’ reality was taken into account, leading to the conclusion that tools and new precision-medicine approaches need validation. In addition to the cost-benefit of the technologies, the GDPR’s impact on data collection and the complexity of certification were addressed.
In the textile industry, regarding machine vision, it was agreed that inspecting fabrics with stripes and patterns poses challenges. Technology can have a strong impact here, with automatic inspection solving these challenges rather than manual inspection. The goal is to develop an optical inspection solution that does not slow down the production process while guaranteeing the speed of detection.
In construction, the next steps in applying augmented and virtual reality were identified, along with the need for public policies that encourage their use throughout the construction process and in the exploitation of real estate assets.
These conclusions will be further detailed and made available to the participants in the event.
This action is co-financed by the Norte2020 Operational Programme and the European Union, through the European Regional Development Fund, within the scope of CCG’s Scientific and Technological Knowledge Transfer actions.
The CCG – Centre for Computer Graphics – received a visit from the SONAE Retail Innovation Committee on 17 May.
As a partner of SONAE in different innovation projects, CCG has opened its doors to showcase some of the technologies that are being explored in its applied research domains (CVIG, EPMQ, PIU, and UMC).
For its part, SONAE presented some of the innovations being prepared in logistics across the group’s various branches, namely in supply chain, labeling, systems integration, video analytics, stock management, and worker safety and ergonomics, among others.
CCG is developing with SONAE a mixed-reality project applied to food retail logistics. Among other advantages, this project optimizes training for the picking activity and improves logistics processes.
The CCG mixed-reality project was recently chosen from hundreds of technology projects to be featured at SONAE’s 2017 annual results presentation.
At the end of the event, while copies of SONAE’s “Retail Book of Innovation 2017” were being distributed, CCG took the opportunity to talk with Tania Calçada, SONAE’s Innovation & Future Tech Area Manager, about the theme of the day: innovation.
CCG: Innovation is present in the various business areas of Sonae. Just a simple visit to the Sonae website allows us to find big numbers and different awards for innovation. Do you consider innovation to be a high priority for Sonae?
Sonae: Yes, indeed. SONAE chose innovation as one of its seven values. These values guide how employees relate to each other and to the outside world, and how they should do business. Since then, the importance of innovation for the company has been demonstrated. In addition, there is our mission statement: the word innovation appears again there, in a statement of just over 20 words.
CCG: Is it frequent to do such visits as this to innovation centers?
Sonae: Yes, it is frequent, and it is not limited to the innovation team. Innovation being one of our values implies that the various teams share that responsibility.
If we look at the innovation book, most of the projects in it were not followed directly by the innovation team. What we do is run an open call for projects to appear in the book and for the internal innovation awards. People submit their projects, and the winners are rarely projects from the innovation team. The teams behind these projects naturally seek external collaboration, drawing on partners such as CCG, who inspire them and later help them realize their ideas.
CCG: Why this particular visit to CCG? Is it the result of a partnership that began a year ago?
Sonae: Yes, it is a visit that comes from a partnership. In fact, we have been working with CCG for more than a year, and we closed our first deal in 2018, with a study related to ergonomics. But CCG is currently working on more than one project with SONAE.
The mixed-reality project has also taken hold and is moving forward, so it makes sense to show the world, at least the internal world, that we are working together.
CCG: CCG was recently at SONAE’s 2017 results presentation, where, at SONAE’s invitation, it presented a mixed-reality project applied to training in a logistics context. Do you think new innovation projects with SONAE could emerge from what you discovered about CCG during this visit?
Sonae: I think so. The visits to the four working groups were very enriching. CCG took care to choose demonstration projects relevant to SONAE’s reality, which captured the attention and interest of the committee members. People very much enjoyed the interactivity of some of the demonstrations and the fact that the researchers themselves presented the projects they are developing. In the group I was in, I saw people who were very interested; some follow-ups have already been scheduled.
CCG: Do you think that the recent technologies of virtual reality, augmented reality, and mixed reality, typically associated with entertainment, will have a major impact on companies?
Sonae: I think so. The professionalization of entertainment technologies has been happening for a long time. It results from companies keeping up with the interests of customers and employees, following social and behavioral trends. But the movement also happens in reverse.
Technology from the gaming world has been used successfully in the business world. Gamification techniques are used today to motivate and create healthy competitiveness. Alternative realities have the potential to make work more appealing, effective, and efficient.
I think the application of alternative realities in business will explode when devices such as head-mounted displays become cheaper and more robust. There is talk of use cases with a lot of potential, but these will only gain scale when the hardware can withstand intense operations, as is the case with retail logistics. While the devices are still a barrier, for content production we have already seen examples of highly skilled partners. CCG is a good example.
Big Data. Big Data Analytics. Data Mining. Are you aware of these technological terms and their implications in the business world?
Big Data Analytics is a topic that covers various techniques and technologies that bring competitive advantages to companies and to the most diverse types of entities.
With the practice of Big Data Analytics, you can gain insights into market behavior, improve internal work processes, and make faster, better-informed decisions, among other advantages.
But let’s take it step by step. If you are not yet familiar with Big Data Analytics, let’s start with the basics: the concept of Big Data.
To explain Big Data, a term that has changed the way we do business, we can talk about collecting, integrating, storing, and processing massive amounts of data to extract useful information through analytical mechanisms (e.g., dashboards).
Going further, the Big Data concept is mainly based on storing huge volumes of data in distributed computing infrastructures: clusters of interconnected computers in on-premises environments (inside a company’s own facilities) or in the cloud.
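As a toy illustration of the “process massive data, extract useful information” idea, here is a single-machine, pure-Python sketch of the map/reduce pattern that cluster frameworks (such as Hadoop or Spark) distribute across many computers. The records and store names below are invented for the example; this is not CCG’s actual stack.

```python
from collections import defaultdict

# Toy event log; in a real Big Data setting these records would be
# spread across many machines in a cluster (on-premises or cloud).
events = [
    {"store": "Braga", "amount": 19.9},
    {"store": "Porto", "amount": 5.0},
    {"store": "Braga", "amount": 12.1},
]

def map_phase(records):
    """Map step: emit (key, value) pairs, here (store, sale amount)."""
    for rec in records:
        yield rec["store"], rec["amount"]

def reduce_phase(pairs):
    """Reduce step: aggregate values per key, here total sales per store."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# The aggregated result is what would feed a dashboard or KPI report.
totals = reduce_phase(map_phase(events))
```

In a cluster, the map step runs in parallel on each machine holding a slice of the data, and only the small per-key aggregates travel over the network to the reducers, which is what makes the pattern scale to massive volumes.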
And how do you collect this data? What are the data sources? And what are its main characteristics?
Organizations collect data from a variety of sources, from social networks to purchasing, from sensors to machine-to-machine transmissions, from filling forms to running a specific machine.
Examples of everyday data sources are e-mail, text documents, videos, audio files, purchases made on the internet, etc. Depending on the context, these data sources will have different characteristics and are commonly described by three main ones: volume, velocity, and variety (the classic “three Vs” of Big Data).
All this data collection work has a purpose. In a business context, careful data analysis can reduce costs, increase revenue, improve productivity, and streamline business processes.
The world’s largest companies use descriptive analytics (e.g., dashboards, reports, KPIs) and machine learning methods to evaluate options and make better-informed, and therefore more assertive, decisions to improve the services they offer or the products they sell. Due to the constant growth of the data sets available to organizations, these methods are now enabled by Big Data techniques, technologies, and infrastructures, which handle the collection, storage, processing, and analysis of these new types of data.
In industry, production-line stoppages mean lost business and profits. Industry 4.0 technology trends, such as Big Data techniques and technologies or the Internet of Things, can reduce equipment downtime by measuring overall equipment effectiveness and repair needs.
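The overall effectiveness mentioned above is commonly quantified as OEE (Overall Equipment Effectiveness), the product of availability, performance, and quality. A minimal sketch of the standard formula follows; the shift figures are invented for illustration and are not from any CCG project.

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Overall Equipment Effectiveness = availability x performance x quality."""
    availability = run_time / planned_time                      # share of planned time actually running
    performance = (ideal_cycle_time * total_count) / run_time   # actual speed vs. ideal speed
    quality = good_count / total_count                          # share of parts produced without defects
    return availability * performance * quality

# Example shift: 480 min planned, 420 min actually running,
# ideal cycle of 0.5 min/part, 800 parts produced, 760 of them good.
score = oee(480, 420, 0.5, 800, 760)  # roughly 0.79, i.e. ~79% effectiveness
```

Tracking this single number over time, fed by sensor data from the line, is one concrete way Big Data and IoT measurements translate into downtime reduction decisions.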
The use of Big Data in Industry also allows for a much more detailed view of the effectiveness of the organization and its processes, leading to more informed, effective and efficient (real-time) decision-making processes.
Here we can talk about the Business Intelligence Platform for Data Integration, developed in a partnership that includes EPMQ, Bosch Car Multimedia Portugal and the University of Minho.
This project includes an integrated data system that allows, through an iterative process, the development of the Organizational Big Data Warehouse, improving factory operations in terms of efficient access to, and the quality of, the critical information needed for decision-making and stakeholder involvement.
Big Data is thus already a common expression in the CCG vocabulary. This work is carried out in the EPMQ applied research domain, under the scientific coordination of Maribel Santos.
The DTx Digital Transformation CoLab – Experiencing the Future is a new collaborative laboratory that will operate across three main poles, in Minho (Braga and Guimarães), Matosinhos, and Évora, and will have CCG as an associated partner.
The notarial deed of incorporation of the Collaborative Laboratory in Digital Transformation (DTx) – Experiencing the Future was signed on May 10 in Guimarães. The “DTx CoLab” has the legal status of a non-profit association.
The CoLab will be coordinated by the University of Minho and led by Professor António Cunha.
DTx Digital Transformation CoLab aims to address new paradigms in products, services and the human-machine interface, as well as the consequent changes in industry and society, while promoting collaborative research and technological development, bridging the gap between the multidisciplinary academic knowledge and the diverse industrial competences.
In addition to the objective of applied research in different areas associated with digital transformation, this laboratory aims to create qualified employment and scientific employment in Portugal by implementing research and innovation agendas aimed at creating economic and social value.
The DTx has 18 partners/associates: Universidade do Minho; Universidade de Évora; Universidade Católica Portuguesa; Centro de Engenharia e Desenvolvimento (CEIIA); Laboratório Ibérico Internacional de Nanotecnologia (INL); Bosch Car Multimedia; Accenture; Embraer Portugal; IKEA Industry; Cachapuz-Bilanciai; Celoplás; ebankIT; Neadvance; NOS; Primavera; Simoldes Plásticos; TMG Automotive and WeDo Technologies.
As affiliated members are the centers of innovation: Centre for Computer Graphics (CCG) and Polo of Innovation in Engineering of Polymers (PIEP).
The Colab DTx is based in Guimarães, where it will also have laboratories, as well as in Braga, Matosinhos, and Évora.
In addition to funding from its members, there is EUR 7.5 million in support guaranteed by FCT (Fundação para a Ciência e a Tecnologia), within the scope of the Collaborative Laboratories Creation Programme.
“The design and development of cyber-physical system products, as well as evolutionary systems, integrating, for example, intelligent materials, digital manufacturing technologies, and solutions based on artificial intelligence, will be the target of this CoLab, in the framework of strong collaboration between entities of the scientific system and the economic-productive fabric, with international reference partners such as MIT,” according to the press release.
DTx’s projects will address digitization in the development of products, systems, and manufacturing solutions, according to the collaborative lab’s technology roadmaps and partners’ challenges.
These activities will be carried out by contracted personnel, by teams of participating entities and by Ph.D. and Masters students.
At the end of 2017, FCT had approved the proposal to create the DTx, with “great recognition for its scientific value and relevance in contributing to the strategic development of Portugal.”
More info at the official website DTX Digital Transformation CoLAB.
The Senior Inclusive project was featured in the program “O Norte Somos Nós”, in Porto Canal.
As the name implies, the Senior Inclusive is focused on the elderly population, seeking to provide better accompaniment to the elderly.
By creating equipment fit for the needs of the senior population, this project aims to keep older people active and improve their sense of security. This benefits the elderly and, at the same time, a whole community of caregivers.
This equipment, made up of a tablet + wristband (both adapted to the hardware level, with an inclusive design and customized software depending on the user profile), allows the elderly to make simple and quick calls/video calls to their caregivers, family members, or friends.
It is also possible to interconnect the equipment with devices for measuring health parameters, with real-time location and fall detection. This makes it possible to transmit emergency and/or abnormality alerts to the elderly person’s caregivers.
Ergonomic adaptation to the device is another concern of this project.
The CCG, through its applied research domain CVIG, is collaborating in the development of the project software strand. In particular, it is creating a module of Virtual Assistants (Avatars), built in the image of people or characters known to the elderly. These avatars are an element of companionship always present in the system. Virtual assistants communicate with the elderly, with all of their functionality being remotely configured on the web and mobile platforms, by family members or caregivers.
CVIG is also working on the research and development of pattern recognition algorithms to build a drug recognition application through computer vision.
As Nelson Alves, a researcher at CCG, said in the “90 Seconds of Science” program: “We are working on software to support medication. This population is, as a rule, subject to the daily intake of various medications, which at some point may create confusion or doubt. Through computer vision and image and pattern recognition algorithms, it is enough for the elderly person to show the medicine box to the device’s camera, and it recognizes which drug is associated with it, helping them take the right medication.”
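The project’s actual recognition algorithms are not detailed here, but the general idea of matching a captured image against known reference images can be sketched in a few lines. The toy below (pure Python, invented box labels, tiny synthetic “photos” represented as lists of RGB pixels) compares coarse color histograms; a real system would use far richer visual features.

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize RGB pixels into coarse bins and count how often each bin occurs."""
    step = 256 // bins
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def similarity(h1, h2):
    """Histogram intersection: how much the two pixel distributions overlap."""
    return sum(min(h1[k], h2[k]) for k in h1)

def recognize(captured, references):
    """Return the label of the reference image that best matches the capture."""
    cap = color_histogram(captured)
    return max(references,
               key=lambda label: similarity(cap, color_histogram(references[label])))

# Tiny synthetic "photos" standing in for real medicine-box images:
references = {"box A (mostly red)": [(250, 10, 10)] * 16,
              "box B (mostly blue)": [(10, 10, 250)] * 16}
captured = [(240, 20, 20)] * 12 + [(10, 10, 250)] * 4  # a mostly red capture
match = recognize(captured, references)
```

Even this crude nearest-match scheme shows the pipeline shape: extract features from the camera frame, compare them against a database of known boxes, and report the best match to the user.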
The Senior Inclusive project is currently in the midst of its development, and the testing phase has already begun with end users.
Senior Inclusive (project nº 17967) is supported by the European Regional Development Fund (ERDF) through the NORTE2020 – National Portuguese Operational Program.
The FAPA – From Audiovisual Perception to Action: the processing of spatiotemporal components – project was developed by CCG, through its applied research domain PIU, between 2015 and 2017.
The final results of this project were presented at the 12th Bial Foundation Symposium, which took place between April 4 and 7, 2018, at the Casa do Médico in Porto.
The FAPA project studied the spatial and temporal resolution of the auditory and visual systems, individually and jointly, in a series of dynamic tasks. The main objective was to determine whether, with static and moving stimuli, we can correctly estimate where an event is located in space and time.
In short, we concluded that we are able to:
In a real-time gait synchronization task, it was found that synchronization in the condition where only auditory information was available did not differ from the condition in which auditory and visual information was available.
As main outputs, this project contributed to the publication of three papers in international journals and to a chapter in a scientific book:
Check out these publications on the FAPA project, as well as all the scientific publications of the PIU applied research domain, on the CCG website.
The concept of Augmented Reality (AR) first appeared more than 50 years ago, in 1962, but it is only now that the bet on this technology has yielded clear results, with its applicability in industry proving advantageous for companies in various sectors. In this article, we look at the birth of augmented reality, at the existing technology, and at its various current applications, with particular emphasis on its use in industry.
An augmented reality system has to combine the real world with virtual elements, be interactive, run in real time, and align its 3D objects with the real world. The virtual elements can address any of the sensory modalities: visual, auditory, tactile/haptic, and olfactory. However, most uses of AR technology target only visual stimulation.
The first reference to the concept of Augmented Reality goes back to the year 1962 when Morton Heilig built a machine with immersive multisensory technology that he named Sensorama.
This machine offered an immersive multisensory AR experience, as it was able to display 3D images with stereo sound, haptic sensations through body tilt and wind, and aromas (for a modern example, see the olfactory devices developed by Takuji Narumi).
In 1968, Ivan Sutherland developed the first head-mounted display (HMD), which he called The Sword of Damocles. An HMD is a device worn on the head, or built into a helmet, that places an optical display in front of one eye (monocular HMD) or of each eye (binocular HMD).
Although the concept had been around for a long time, the term Augmented Reality was only coined in 1992 by Boeing researcher Tom Caudell. He and his colleague David Mizell faced the challenge of providing an alternative to the diagrams and marking devices used to guide workers on the company’s factory floor. The solution they developed was a worker-wearable HMD that displayed the aircraft diagrams and schematics, projected them onto reusable boards, and allowed the displayed information to be changed from a computer.
AR became popular among the masses through the game Pokémon GO, released in July 2016. The game used the GPS and camera of compatible devices and allowed players to capture, battle, and train virtual creatures, called Pokémon, which appeared on the device screens as if they were part of the real world.
Thanks to the game’s success, which drew millions of players, the term AR left its small niche and entered everyday conversation. The technology used to place the virtual elements (in this case, Pokémon) in the real world was geolocation. However, the most common technology for this purpose is Computer Vision (CV).
Due to GPS error, the positioning of the Pokémon shows a noticeable deviation (usually greater than 6 meters). In the context of Pokémon GO, this imprecision hardly matters, but for AR in industrial applications an error of this magnitude makes practical use impossible.
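To get a feel for what a few meters of GPS error mean on the ground, one can compute the distance between two nearby coordinates with the standard haversine formula. The coordinates below are arbitrary illustrative values; a shift of just 0.0001° in longitude already amounts to several meters, comparable to typical GPS error.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Two points differing only by 0.0001 degrees of longitude,
# at a latitude similar to northern Portugal:
error = haversine_m(41.4525, -8.2918, 41.4525, -8.2917)  # roughly 8 meters
```

A virtual overlay anchored with this much positional uncertainty would drift visibly off the machine or part it is supposed to annotate, which is why industrial AR relies on computer vision rather than geolocation.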
Image 1: Pokémon Go
For greater accuracy, AR technology uses CV through two main approaches: marker-based tracking, in which fiducial markers placed in the scene anchor the virtual content, and markerless tracking, in which natural features of the environment itself are recognized and tracked.
AR can be experienced through a large number of hardware devices such as smartphones, tablets, desktop or portable computers, smart glasses and HMDs.
AR HMDs are high-cost devices with a very specialized set of hardware, which is needed to perform the three-dimensional reconstruction of real space. Currently, the best-known AR HMD is the Microsoft HoloLens, which is actually a Mixed Reality device.
Image 2: HoloLens: Microsoft Press tool pictures.
In 1994, Milgram defined what he called the “Virtuality Continuum”: a scale ranging from the completely virtual, virtuality (commonly known as Virtual Reality), to the completely real, reality.
Between these two extremes lies the Mixed Reality (MR), which is the combination of the virtual world and the real world, where an interaction between real and virtual objects is possible.
AR is a type of MR that sits closer to the reality end of the scale. Rather than the device merely recognizing the real space and superimposing virtual elements on it, MR intends, for example, for a virtual lamp placed on a real physical table to be affected by the ambient light, so that it is not possible to tell whether the lamp is real or virtual.
MR also aims to respond to technical challenges such as the occlusion of a virtual element by a real one, an important characteristic for the integration of the virtual with the real to be seamless. In addition, MR intends to make it possible to manipulate virtual objects. The Microsoft HoloLens addresses these MR goals, which is why it is considered an MR device.
Image 3: Reality–virtuality continuum
One of the most anticipated MR devices is the Magic Leap One, whose “Lightfield Photonics” technology promises to revolutionize the world of AR and MR. The technology promises to “generate digital light at different depths and blend perfectly with natural light to produce realistic digital objects that coexist in the real world.” The idea is that this advanced technology allows our brain to process digital objects as naturally as real-world objects, making smart glasses more comfortable to use for long periods of time.
The company Magic Leap, despite not yet having launched a single device on the market, has already received large investments (2.3 billion dollars) from several important investors, such as Google, Alibaba, JP Morgan, and Warner Bros, among others. This is generating great expectations of a major technological revolution in the AR area.
An AR device aimed squarely at the industrial market is the Daqri SmartGlasses, a pair of glasses with a high price.
Similar to these, but at a more affordable price, are Epson’s Moverio Pro BT-2000 and the Vuzix M300. These types of AR devices are called smart glasses because they are more compact and have less processing power. They are useful for presenting 2D information and usually run a basic Android operating system, like mobile devices.
A famous example of this type of smart glasses was Google Glass, which was discontinued for the consumer market in January 2015 but was relaunched for the business market under the Glass brand.
More recently, the industry has finally begun to realize that the application of AR is an innovative and differentiating factor in an increasingly competitive market. Its applications can be immense, as well as the advantages derived therefrom:
CCG, through its Computer Vision and Interaction Graphics (CVIG) domain, over the years, has been developing multiple projects focused on this technology, across several areas and sectors of industry.
Each of these projects brought specific advantages to the business in question. Some of these projects and applications are the following examples.
Within the scope of CCG’s bet on Mixed Reality, a demonstration was developed, together with SONAE MC, for application in the context of hypermarket logistics. This application was featured on TVI24’s “NXT – The Next Step” program. The solution aims to optimize training for the picking activity at SONAE MC and to increase levels of safety at work.
This project was part of SONAE’s annual results presentation. It consists of developing an MR solution for training in order preparation.
As can be seen, augmented reality can be applied in industry in many ways, by multiple companies and institutions. It is thus increasingly usual and natural for the industrial and corporate universe to absorb this type of technology. It has been a long path, but a rewarding one, and one that still has much to show.
T. Caudell and D. Mizell, “Augmented reality: an application of heads-up display technology to manual manufacturing processes,” in Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, vol. ii, pp. 659–669, Jan. 1992.
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355–385.
Milgram, P., Takemura, H., Utsumi, A., & Kishino, F. (1994). “Augmented Reality: A class of displays on the reality-virtuality continuum.” Proceedings of Telemanipulator and Telepresence Technologies, SPIE Vol. 2351.
About the author:
Nuno Sousa holds a bachelor’s and a master’s degree in Computer Engineering from the University of Minho, with specializations in Application Engineering and in Software Analysis and Design. Over the last 10 years, he has been involved in the development and implementation of several projects in the areas of Augmented Reality, Software Development (Web and Desktop), Human-Computer Interaction, Mobile Computing, Home Automation, Information Systems, and Software Engineering.
The 2018 Job Fair [IPVC Work +], of the Polytechnic Institute of Viana do Castelo (IPVC), will take place on April 11 and 12 at the Centro Cultural de Viana do Castelo.
The CCG – Centre for Computer Graphics – will be present at this job fair on April 12th.
This is an opportunity to get to know the CCG better, its projects and all its vacancies. Emphasis will be on the research grants and on the work contracts available in the different CCG projects.
CCG recruitment vacancies are always available online on the CCG recruitment page.
The Job Fair is open from 9:30 a.m. to 6:30 p.m. It is part of the IPVC Summit, which includes other activities such as smart talks, pitch recruitment, masterclasses, and workshops, among others.
You can consult the program of the event in pdf.
This summit is an annual networking event that offers students, teachers, and companies the chance to exchange experiences, knowledge, projects, and contacts.
The IPVC Summit is an innovative event in Portugal, gathering thousands of participants. In 2018, 8,000 active participants are expected. Admission is free.
Over two days, the various IPVC stakeholders will come together for the event’s many activities, from the Job Fair to Education, from Smart Talks to Made in IPVC, and more.