Review:RTE

Robust Testing Environments (RTE)

Currently, much time is spent setting up system tests, understanding exactly what is being tested, diagnosing errors that occur during tests, and verifying the test results. These tasks are especially difficult for distributed enterprise applications composed of a large number of software components running on a collection of heterogeneous servers or in cloud environments. Generating load that accurately simulates complex user behaviour is a growing challenge when testing large-scale critical systems. For example, as more and more applications move to cloud-based infrastructures, load generation designed for applications residing on a local area network may be ineffective for robust testing. Research is required to develop new load generation techniques that take into account cloud-based infrastructures, where applications may be physically distributed across different infrastructures in different geographical locations, and where network bandwidth may be unknown or of varying quality at different points. This research will develop a set of prototype tools to enable more efficient system testing of large enterprise applications.
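To make the challenge concrete, the sketch below shows one possible way a load generator could account for geographically distributed deployments with differing link quality: a target request rate is split across regions in proportion to an estimated bandwidth. This is a purely illustrative Python sketch, not one of the project's deliverables; the region names, endpoints and bandwidth figures are hypothetical.

  from dataclasses import dataclass

  # Hypothetical description of one cloud region hosting part of the
  # application under test; names and figures below are illustrative only.
  @dataclass
  class Region:
      name: str
      endpoint: str              # base URL of the deployment in this region
      est_bandwidth_mbps: float  # estimated (or measured) link quality

  def plan_load(regions, total_rps):
      """Split a target request rate across regions in proportion to their
      estimated bandwidth, so that low-quality links are not overdriven."""
      total_bw = sum(r.est_bandwidth_mbps for r in regions)
      return {r.name: total_rps * r.est_bandwidth_mbps / total_bw
              for r in regions}

  if __name__ == "__main__":
      regions = [
          Region("eu-west", "https://eu.app.example", 100.0),
          Region("us-east", "https://us.app.example", 40.0),
          Region("ap-south", "https://ap.app.example", 10.0),
      ]
      for name, rps in plan_load(regions, total_rps=300).items():
          print(f"{name}: {rps:.0f} requests/s")

In a real harness the bandwidth estimates would be measured at test time rather than assumed, and each per-region rate would drive its own pool of request workers.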

People involved

  • Sandra Theodora Buda - PhD Student, UCD
  • Vanessa Ayala - PhD Student, UCD
  • Seán Ó Ríordáin - PhD Student, TCD
  • Thomas Cerqueus - Postdoc researcher, UCD
  • Patrick McDonagh - Postdoc researcher, DCU
  • Anthony Ventresque - Postdoc researcher, UCD
  • Christina Thorpe - Postdoc researcher, UCD
  • Philip Perry - Postdoc researcher, UCD
  • Simon Wilson - Professor, TCD
  • John Murphy - Professor, UCD
  • Liam Murphy - Professor, UCD

Collaborations

  • IBM Dublin
  • Eduardo Cunha de Almeida - Assistant Professor, Federal University of Paraná (Brazil)
  • Gerson Sunyé - Associate Professor, University of Nantes (France)
  • Yves Le Traon - Professor, University of Luxembourg (Luxembourg)

Statement of work

The following table presents the deliverables for the project; an illustrative sketch of one recurring deliverable topic, table-level database sampling, follows the table.

 Quarter | Deliverables | Status
 Q2 2012 | Survey of data mining algorithms | Delivered
 201 | Survey of anonymization techniques based on k-anonymity for microdata (Sep-12, 180) | Delivered
 201 | High-level architecture design of a privacy-preserving data publishing assisting tool; Survey of data mining algorithms (Mar-13, 250) | Delivered
 201 | Initial algorithm to automatically extract relationships between schemas in the DB; Survey of algorithms for database sampling (Jan-12, 230) | Delivered
 201 | Design and specification of an algorithm for database sampling (table-level) (Oct-12, 150) | Delivered
 201 | Evaluation of scalability and performance of the DB sampling algorithm | Delivered
 201 | Evaluation of the DB sampling algorithm on a real test case (May-13, 210) | Delivered
 201 | Design and specification of an algorithm for database sampling (attribute-level); Survey of algorithms for database extrapolation (Dec-13, 160) | Delivered
 201 | Design and specification of an algorithm for database extrapolation; Survey of stress testing techniques (May-14, 150) | Delivered
 201 | Design of a stress testing methodology for Cloud applications; Survey of self-tuning/automatic testing techniques (Sep-13, 400) | Delivered
 201 | Design of a self-tuning methodology for Cloud applications (Sep-13, 400) | Delivered
 201 | Review and implementation of previous approach; An analysis of Firefox bug report data (Jan-12, 410) | Delivered
 201 | Survey of the literature on statistical reliability in large-scale software systems (Mar-13, 270) | Delivered
 201 | Questionnaire: Testing the Cloud (Jul-12, 90) | Delivered
 Q3 2013 | Organisation of TTC workshop (ISSTA 2013) (Oct-12, 300) | website
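Several of the deliverables above concern database sampling for testing. As a rough, hypothetical illustration only (the project's actual algorithms are not described on this page), the Python sketch below samples a fraction of a parent table and keeps only the child rows that reference sampled keys, so the reduced database remains referentially consistent; the table and column names are invented for the example.

  import random

  def sample_tables(customers, orders, fraction=0.1, seed=42):
      """Naive table-level sampling: draw a fraction of the parent table,
      then keep only child rows whose foreign key references a sampled
      parent, preserving referential integrity in the sample."""
      rng = random.Random(seed)
      k = max(1, int(len(customers) * fraction))
      sampled_customers = rng.sample(customers, k)
      kept_ids = {c["id"] for c in sampled_customers}
      sampled_orders = [o for o in orders if o["customer_id"] in kept_ids]
      return sampled_customers, sampled_orders

  if __name__ == "__main__":
      rng = random.Random(0)
      customers = [{"id": i} for i in range(100)]
      orders = [{"id": i, "customer_id": rng.randrange(100)}
                for i in range(500)]
      cs, ords = sample_tables(customers, orders)
      print(len(cs), "customers and", len(ords), "orders in the sample")

A production-quality sampler would also have to follow longer foreign-key chains and balance the sample so that rare but important value distributions survive the reduction.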