Background

Every organization aims to deliver a high-quality software product that meets its schedule and budget. Achieving all of these at once is rarely possible, because each factor is limited and trade-offs between them cannot be avoided. The result of these compromises is the 'as-is' software, which may differ considerably from the 'ideal' software and may contain design flaws. Such flaws accumulate from early development onward and grow continuously if no action is taken to manage and control them. This symptom is called Technical Debt (TD). Some organizations are unaware of it and treat it lightly, as a low-priority bug fix or a short-term refactoring task. Left unmanaged, it slowly spreads like a cancer through the software product and the organization, because accumulated design flaws become complex to fix and demand substantial resources to refactor. Failure to manage it may leave the software unmaintainable (i.e., TD bankruptcy).
Therefore, the first and foremost task in managing and controlling TD is to make the organization aware of it. The organization can surface TD by identifying code smells. A code smell indicates a structural weakness residing in the system and can be detected with a set of metrics. Although many code smell detection tools have been developed, most of them target a single language paradigm, pay little attention to customizing metrics, adding new metrics, or registering new smells, and offer little extensibility toward other components/features.
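To illustrate metric-based detection, the following is a minimal sketch of the God Class detection strategy described by Lanza and Marinescu (2006), which combines three metrics with thresholds. The metric values for the two classes below are invented for illustration; in the planned tool they would come from a Moose model of the analyzed system.

```python
# Minimal sketch of metric-based smell detection over precomputed metric
# values. The God Class rule (ATFD, WMC, TCC) follows Lanza & Marinescu
# (2006); the example class data is hypothetical.

def is_god_class(metrics: dict) -> bool:
    """Flag a class that accesses much foreign data (ATFD), is very
    complex (WMC), and has low cohesion (TCC)."""
    return (metrics["ATFD"] > 5          # accesses to foreign data
            and metrics["WMC"] >= 47     # weighted method count (very high)
            and metrics["TCC"] < 1 / 3)  # tight class cohesion (low)

classes = {
    "ReportManager": {"ATFD": 12, "WMC": 60, "TCC": 0.1},
    "Point":         {"ATFD": 0,  "WMC": 4,  "TCC": 0.9},
}

flagged = [name for name, m in classes.items() if is_god_class(m)]
print(flagged)  # → ['ReportManager']
```

The same pattern (a boolean rule over a dictionary of metric values) generalizes to the other detection strategies from the literature.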
Goal

This thesis therefore aims to develop an infrastructure (based on the Moose platform) that can:
- Detect code smells by reusing a set of metrics from the literature on code smell detection
- Register new smells by customizing/configuring existing metrics
- Model detected smells for storage (e.g., JSON, MSE (the Moose model exchange format), etc.)
- Save detected smells into a repository
- Query/browse detected smells
- Be easily extensible with other features (e.g., threshold calculation, TD value monitoring, historical information) in the future
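The first two goals amount to treating a smell as a named, user-configurable rule over existing metrics. The sketch below shows one way this could look; all names here are assumptions for illustration, not the actual Moose API.

```python
# Hypothetical sketch: registering a new smell means supplying a name and a
# predicate over existing metric values, so no new detection code is needed
# beyond the rule itself.

SMELLS = {}

def register_smell(name, rule):
    """Register a detection rule: a predicate over a metrics dictionary."""
    SMELLS[name] = rule

def detect(entity_metrics):
    """Return the names of all registered smells the entity triggers."""
    return [name for name, rule in SMELLS.items() if rule(entity_metrics)]

# A user-defined smell built from existing metrics and custom thresholds.
register_smell("LongMethod", lambda m: m["LOC"] > 50 and m["CYCLO"] > 10)

print(detect({"LOC": 120, "CYCLO": 15}))  # → ['LongMethod']
```

Because rules are plain predicates, customizing a smell is a matter of re-registering it with different thresholds.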
Task Description

Phase 1: Implement an existing set of metrics (with thresholds) for code smell detection and evaluate the detection results
- Map the metrics provided by the Moose platform to the set of metrics used for code smell detection
- Set/define an appropriate threshold for each smell
- Evaluate the detected smells against other code smell detection tools to assess detection capability and false positives
- Calibrate the metrics and/or thresholds, where possible, to improve the detection results
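The evaluation step above can be sketched as a set comparison between our detections and those of a reference tool for the same smell. The class names below are invented; in practice both sets would come from the tools' reports, and the disagreements would be inspected manually to drive threshold calibration.

```python
# Sketch of the Phase 1 evaluation: compare our detections against a
# reference tool's results for one smell. The sets are hypothetical.

ours  = {"ClassA", "ClassB", "ClassC"}   # classes our tool flags
other = {"ClassA", "ClassC", "ClassD"}   # classes the reference tool flags

agreed      = ours & other   # flagged by both tools
possible_fp = ours - other   # flagged only by us: candidates for manual
                             # inspection and threshold calibration
missed      = other - ours   # flagged only by the reference tool

precision_vs_other = len(agreed) / len(ours)
print(sorted(possible_fp))   # → ['ClassB']
```

A low `precision_vs_other` or many `possible_fp` entries signals that the thresholds need recalibration.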
Phase 2: Develop features/components with the following functionality:
- List detected smells with their metric values
- Save/model all detected smells into a repository
- A query mechanism to filter/select the smells of interest
- Extensibility toward other possible features (such as retrieving historical smell information, TD value visualization, etc.)
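The storage and querying functionality above can be sketched with JSON, one of the storage formats named in the goals. The record fields and the file name are illustrative assumptions, not the final model.

```python
# Sketch of Phase 2: persist detected smells as JSON records (a simple
# file-based repository), then reload and filter them.
import json

detected = [
    {"smell": "GodClass",   "entity": "ReportManager", "WMC": 60},
    {"smell": "LongMethod", "entity": "Report.render", "LOC": 120},
]

# Save all detected smells into the repository.
with open("smells.json", "w") as f:
    json.dump(detected, f, indent=2)

# Query mechanism: reload and select the smells of interest.
with open("smells.json") as f:
    stored = json.load(f)

god_classes = [s for s in stored if s["smell"] == "GodClass"]
print([s["entity"] for s in god_classes])  # → ['ReportManager']
```

The same records could instead be exported as MSE so that other Moose tools can reload them as a model, which is what makes the extensibility goal plausible.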
Phase 3: Validate the prototype tool on small and medium-sized open-source projects
References

- Moha, N., Guéhéneuc, Y.-G., Duchien, L., & Le Meur, A.-F. (2010). DECOR: A Method for the Specification and Detection of Code and Design Smells. IEEE Transactions on Software Engineering, 36(1), 20–36. doi:10.1109/TSE.2009.50
- Lanza, M., & Marinescu, R. (2006). Object-Oriented Metrics in Practice: Using Software Metrics to Characterize, Evaluate, and Improve the Design of Object-Oriented Systems. Springer Berlin Heidelberg.
- Fontana, F. A., Ferme, V., Zanoni, M., & Roveda, R. (2015). Towards a Prioritization of Code Debt: A Code Smell Intensity Index. 16–24.
- Nierstrasz, O., Ducasse, S., & Gîrba, T. (2005). The Story of Moose: an Agile Reengineering Environment. 1-59593-014-0/05/0009
- Ducasse, S., Gîrba, T., Lanza, M., & Demeyer, S. (n.d.). Moose: a Collaborative and Extensible Reengineering Environment. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.67.7264