PENED

Advanced Methods of Visual Computing for Cultural Heritage Applications

Funding
Funding Organization: General Secretariat for Research and Technology
Funding Programme: PENED 2003

Consortium
- ITI-CERTH
- University of Athens (GR)
- Aristotle University of Thessaloniki (GR)
- Foundation of the Hellenic World (GR)
- University of Houston (US)
- Ecole Nationale des Ponts et Chaussees (FR)

Link
www.iti.gr/db.php/en/projects.html/CULTURAL_HERITAGE.html

 

Visual Computing has brought major advances to medical image analysis, television and cinema, and the modeling of buildings and cities. The area of cultural heritage has so far been affected far less, apart from Virtual Reality (VR) applications. Nevertheless, promising opportunities for Visual Computing remain in this area and have begun to be explored in recent years. Moreover, cultural heritage, and archaeology in particular, has given rise to novel, difficult, and interesting visual computing problems.

The present research project aims at developing advanced methods for the reconstruction and visualization of cultural entities and consists of the following four stages:

1. 3D content-based search for objects and object parts in multiple digitized databases, with respect to the similarity of their material, their age, the way they were processed (e.g., sculpted or painted), and their texture and morphology.
2. Synthesis, i.e., placing the retrieved object parts in their correct relative positions.
3. Reconstruction, i.e., filling in possible gaps, and
4. Visualization, where the reconstructed object or scene should appear realistically and be animated in real time.
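The first stage, content-based 3D search, is typically built on shape descriptors that can be compared across databases. As a minimal illustration, the sketch below uses the well-known D2 shape distribution (a scale-normalized histogram of distances between random point pairs) to rank a toy database of point clouds against a query; all names and data are illustrative assumptions, not the project's actual pipeline.

```python
# Illustrative content-based 3D search via the D2 shape distribution:
# a histogram of distances between random point pairs, normalized for
# scale, compared with an L1 distance. Hypothetical names and toy data.
import numpy as np

rng = np.random.default_rng(0)

def d2_descriptor(points, n_pairs=2000, n_bins=32):
    """Scale-normalized histogram of random pairwise distances."""
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    d /= d.mean()                        # crude scale invariance
    hist, _ = np.histogram(d, bins=n_bins, range=(0.0, 3.0))
    return hist / hist.sum()             # probability distribution

def rank_by_similarity(query_points, database):
    """Return database keys sorted by L1 distance between descriptors."""
    qd = d2_descriptor(query_points)
    dists = {name: np.abs(qd - d2_descriptor(pts)).sum()
             for name, pts in database.items()}
    return sorted(dists, key=dists.get)

# Toy database: a sphere-like fragment and a flat, elongated slab.
sphere = rng.normal(size=(500, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)
slab = rng.uniform(-1.0, 1.0, size=(500, 3)) * [4.0, 1.0, 0.1]
db = {"sphere_frag": sphere * 1.5, "slab_frag": slab}

print(rank_by_similarity(sphere, db))  # sphere_frag should rank first
```

Because the descriptor is normalized by the mean pairwise distance, a rescaled copy of the query ranks close to it, which is why a scaled sphere fragment still matches the spherical query better than the slab.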

The degree of independence among the procedures described above determines the capability and efficiency of the reconstruction. Within the research project, five doctoral theses will be prepared in the scientific area of Visual Computing with cultural VR applications, addressing open problems in most of the aforementioned fields.