<accesscontrol>reviewer,pel</accesscontrol>

{| <!--class="wikitable"-->

= Robust Testing Environments (RTE) =

Much time is currently spent setting up system tests, working out exactly what is being tested, understanding errors that occur during tests, and verifying the test results. These issues can be especially difficult for distributed enterprise applications made up of a large number of software components that run on a collection of heterogeneous servers or in cloud environments. Load generation that accurately simulates complex user behaviour is a growing challenge when testing large-scale critical systems. For example, as more applications move to cloud-based infrastructures, load generation designed for applications residing on a local area network may be ineffective for robust testing. Research is needed to develop new load generation techniques that account for cloud-based infrastructures, where applications may be physically distributed across different geographical locations and where network bandwidth may be unknown or vary in quality between points. This research will develop a set of prototype tools to allow for more efficient system testing of large enterprise applications.
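
To make the challenge concrete, the following is a minimal, purely illustrative sketch of what region-aware load generation might look like: simulated clients are spread across geographic regions, each with its own assumed latency and bandwidth, rather than a single LAN-like profile. The region names, client counts, and network figures are assumptions for the example only; this is not project code.

<syntaxhighlight lang="python">
# Illustrative sketch only (not project code): spread simulated load across
# regions instead of assuming one LAN-like network profile. All region names,
# latencies, bandwidths and client counts below are made-up example values.
import random
from dataclasses import dataclass


@dataclass
class Region:
    name: str
    base_latency_ms: float   # assumed round-trip latency to the system under test
    bandwidth_mbps: float    # assumed available bandwidth from this region
    clients: int             # number of simulated clients placed in this region


REGIONS = [
    Region("eu-west", base_latency_ms=30.0, bandwidth_mbps=100.0, clients=50),
    Region("us-east", base_latency_ms=90.0, bandwidth_mbps=50.0, clients=30),
    Region("ap-south", base_latency_ms=180.0, bandwidth_mbps=20.0, clients=20),
]


def simulated_response_time(region: Region, payload_kb: float) -> float:
    """Rough per-request time in ms: round-trip latency + transfer time + jitter."""
    transfer_ms = payload_kb * 8 / region.bandwidth_mbps  # kilobits / (megabits/s) -> ms
    jitter_ms = random.uniform(0, region.base_latency_ms * 0.2)
    return region.base_latency_ms + transfer_ms + jitter_ms


def run_step(payload_kb: float = 64.0) -> None:
    """One load step: every simulated client in every region issues one request."""
    for region in REGIONS:
        times = [simulated_response_time(region, payload_kb) for _ in range(region.clients)]
        print(f"{region.name}: {region.clients} clients, "
              f"mean simulated response {sum(times) / len(times):.1f} ms")


if __name__ == "__main__":
    run_step()
</syntaxhighlight>

In an actual tool the per-region figures would be measured or configured for the target cloud deployment rather than fixed constants.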

= People involved =

* Vanessa Ayala - PhD Student, UCD
* Sandra Teodora Buda - PhD Student, UCD
* Se&aacute;n &Oacute; R&iacute;ord&aacute;in - PhD Student, TCD
* Thomas Cerqueus - Postdoc researcher, UCD
* Patrick McDonagh - Postdoc researcher, DCU
* Philip Perry - Postdoc researcher, UCD
* Christina Thorpe - Postdoc researcher, UCD
* Anthony Ventresque - Postdoc researcher, UCD
* John Murphy - Professor, UCD
* Liam Murphy - Professor, UCD
* Simon Wilson - Professor, TCD

= Collaborations =

* Morten Kristiansen - IBM Dublin
* Eduardo Cunha de Almeida - Assistant Professor, Federal University of Paran&aacute; (Brazil)
* Gerson Suny&eacute; - Associate Professor, University of Nantes (France)
* Yves Le Traon - Professor, University of Luxembourg (Luxembourg)

= Statement of work =

The following table presents the deliverables for the project.

{| border="1"
|- style="text-align: center;"
! Quarter !! Deliverables !! Status
|-
| Q1 2013 || Survey of anonymization techniques based on k-anonymity for microdata || [[Media:RTE_D4.pdf|Done]]
|-
| Q3 2013 || High-level architecture design of a privacy-preserving data publishing assisting tool <br/> Survey of data mining algorithms || In process
|-
| Q3 2012 || Initial algorithm to automatically extract relationships between schemas in the DB || [[Media:RTE_D1.pdf|Done]]
|-
| Q3 2012 || Survey of algorithms for database sampling || [[Media:RTE_D3.pdf|Done]]
|-
| Q1 2013 || Design and specification of an algorithm for database sampling (table-level) || [[Media:RTE_D2.pdf|Done]]
|-
| Q4 2013 || Evaluation of scalability and performance of the DB sampling algorithm || In process
|-
| Q2 2014 || Evaluation of the DB sampling algorithm on a real test case ||
|-
| Q3 2014 || Design and specification of an algorithm for database sampling (attribute-level) <br/> Survey of algorithms for database extrapolation ||
|-
| Q3 2014 || Design and specification of an algorithm for database extrapolation <br/> Survey of stress testing techniques ||
|-
| Q3 2014 || Design of a stress testing methodology for Cloud applications <br/> Survey of self-tuning/automatic testing techniques ||
|-
| Q3 2014 || Design of a self-tuning methodology for Cloud applications ||
|-
| Q1 2013 || Review and implementation of previous approach <br/> An analysis of Firefox bug report data || [[Media:RTE_D5.pdf|Done]]
|-
| Q4 2013 || Survey of the literature on statistical reliability in large-scale software systems || In process
|-
| Q3 2012 || Questionnaire: Testing the Cloud || [[Media:RTE_D6.pdf|Done]]
|-
| Q3 2013 || Organisation of the Testing The Cloud workshop (ISSTA 2013) || [http://ttc2013.ucd.ie Done]
|}

|}
