Review:RTE

Revision as of 15:45, 29 May 2013 by Thomas (talk | contribs)

<accesscontrol>reviewer,pel</accesscontrol>

Robust Testing Environments (RTE)

Currently, much time is spent setting up system tests, understanding exactly what is being tested, diagnosing errors that occur during tests, and verifying the test results. These issues are especially difficult for distributed enterprise applications composed of a large number of software components that run on a collection of heterogeneous servers or in cloud environments. Generating load that accurately simulates complex user behaviour is a growing challenge when testing large-scale critical systems. For example, as more and more applications move to cloud-based infrastructures, load generation designed for applications residing on a local area network may be ineffective for robust testing. Research is required to develop new load generation techniques that take into account cloud-based infrastructures, where applications may be physically distributed across different infrastructures in different geographical locations and where network bandwidth may be unknown or vary in quality between points. This research will develop a set of prototype tools to enable more efficient system testing of large enterprise applications.
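To illustrate the kind of load generation the paragraph above motivates, the following is a minimal sketch of a generator that accounts for geographically distributed clients with differing network conditions. The region names, latency figures, and jitter range are illustrative assumptions, not measurements from the project; network calls are replaced by sleeps so the sketch is self-contained.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-region baseline round-trip latencies in seconds;
# in a real study these would be measured, not assumed.
REGION_LATENCY = {"eu-west": 0.02, "us-east": 0.09, "ap-south": 0.18}

def simulated_request(latency):
    """Stand-in for one client request: just waits out the network delay."""
    time.sleep(latency)
    return latency

def run_load(requests_per_region=20, seed=42):
    """Fire simulated requests for each region and report mean latency."""
    rng = random.Random(seed)
    means = {}
    with ThreadPoolExecutor(max_workers=16) as pool:
        for region, base in REGION_LATENCY.items():
            # Random jitter models the unknown / varying bandwidth and
            # link quality on each path.
            delays = [base * rng.uniform(0.8, 2.5)
                      for _ in range(requests_per_region)]
            observed = list(pool.map(simulated_request, delays))
            means[region] = sum(observed) / len(observed)
    return means

if __name__ == "__main__":
    for region, avg in run_load().items():
        print(f"{region}: mean simulated response {avg * 1000:.1f} ms")
```

A real load generator would replace `simulated_request` with actual protocol traffic and place workers in the regions themselves; the point of the sketch is that per-region latency profiles, rather than a single LAN-style profile, drive the generated load.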

People involved

  • Sandra Theodora Buda - PhD Student, UCD
  • Vanessa Ayala - PhD Student, UCD
  • Seán Ó Ríordáin - PhD Student, TCD
  • Thomas Cerqueus - Postdoc researcher, UCD
  • Patrick McDonagh - Postdoc researcher, DCU
  • Anthony Ventresque - Postdoc researcher, UCD
  • Christina Thorpe - Postdoc researcher, UCD
  • Philip Perry - Postdoc researcher, UCD
  • Simon Wilson - Professor, TCD
  • John Murphy - Professor, UCD
  • Liam Murphy - Professor, UCD

Collaborations

  • IBM Dublin
  • Eduardo Cunha de Almeida - Assistant Professor, Federal University of Paraná (Brazil)
  • Gerson Sunyé - Associate Professor, University of Nantes (France)
  • Yves Le Traon, Professor, University of Luxembourg (Luxembourg)

Statement of work

The following table presents the deliverables for the project.

 Quarter | Deliverables | Status*
 Q1 2013 | Survey of anonymization techniques based on k-anonymity for microdata | D
 Q3 2013 | High-level architecture design of a privacy-preserving data publishing assisting tool; survey of data mining algorithms | IP
 Q3 2012 | Initial algorithm to automatically extract relationships between schemas in the DB; survey of algorithms for database sampling | D
 Q1 2013 | Design and specification of an algorithm for database sampling (table-level) | D
 Q4 2013 | Evaluation of scalability and performance of the DB sampling algorithm | D
 Q2 2014 | Evaluation of the DB sampling algorithm on a real test case | IP
 Q3 2014 | Design and specification of an algorithm for database sampling (attribute-level); survey of algorithms for database extrapolation | D
 Q3 2014 | Design and specification of an algorithm for database extrapolation; survey of stress testing techniques | IP
 Q3 2014 | Design of a stress testing methodology for Cloud applications; survey of self-tuning/automatic testing techniques | NS
 Q3 2014 | Design of a self-tuning methodology for Cloud applications | NS
 Q1 2013 | Review and implementation of previous approach; analysis of Firefox bug report data | D
 Q4 2013 | Survey of the literature on statistical reliability in large-scale software systems | NS
 Q3 2012 | Questionnaire: Testing the Cloud | D
 Q3 2013 | Organisation of the Testing the Cloud workshop (ISSTA 2013) | IP

* D: Done, IP: In Progress, NS: Not Started