Research Article

Crowd Requirement Rating Technique (CrowdReRaT) Model for Crowd Sourcing

by Adetunji Oluwatofunmi, Oyenuga Ebenezer, Otuneme Nzechukwu
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 176 - Number 22
Year of Publication: 2020
DOI: 10.5120/ijca2020920178

Adetunji Oluwatofunmi, Oyenuga Ebenezer, Otuneme Nzechukwu. Crowd Requirement Rating Technique (CrowdReRaT) Model for Crowd Sourcing. International Journal of Computer Applications 176(22):9-14, May 2020. DOI=10.5120/ijca2020920178

@article{ 10.5120/ijca2020920178,
author = { Adetunji Oluwatofunmi, Oyenuga Ebenezer, Otuneme Nzechukwu },
title = { Crowd Requirement Rating Technique (CrowdReRaT) Model for Crowd Sourcing },
journal = { International Journal of Computer Applications },
issue_date = { May 2020 },
volume = { 176 },
number = { 22 },
month = { May },
year = { 2020 },
issn = { 0975-8887 },
pages = { 9-14 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume176/number22/31329-2020920178/ },
doi = { 10.5120/ijca2020920178 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Adetunji Oluwatofunmi
%A Oyenuga Ebenezer
%A Otuneme Nzechukwu
%T Crowd Requirement Rating Technique (CrowdReRaT) Model for Crowd Sourcing
%J International Journal of Computer Applications
%@ 0975-8887
%V 176
%N 22
%P 9-14
%D 2020
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Requirement Engineering (RE) is the area of software engineering that handles the requirements elicitation phase, which involves extracting requirements from business users, clients, and other stakeholders. Producing software that is widely accepted by prospective users depends on the accuracy of the requirements gathered during this phase of software development. Crowd Requirement Engineering (CrowdRE) is an emerging method that harnesses the power of the crowd, displacing traditional methods of collecting software requirements. The power of a crowd lies in its diversity of expertise and talent; however, managing the crowd and analyzing and annotating crowd requirements remain challenging. This research aims to develop a multi-level CrowdRE model, the Crowd Requirement Rating Technique (CrowdReRaT), that enables annotation of requirements by different crowd members at various levels. Several existing models were reviewed systematically to identify their strengths and weaknesses. In addition, the integration of a mini personality survey to understand the nature and skill sets of crowd members was recommended.
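The abstract describes multi-level rating and annotation of requirements by crowd members, but gives no implementation details. As a purely illustrative sketch (the data layout, the level-weighted averaging rule, and all names below are assumptions, not the authors' published design), level-aware aggregation of crowd ratings might look like:

```python
# Illustrative sketch only: the tuple format and the level-weighted mean
# below are assumed for illustration; they are not taken from the paper.
from collections import defaultdict

def aggregate_ratings(ratings):
    """Aggregate crowd ratings per requirement.

    ratings: iterable of (requirement_id, crowd_level, score) tuples.
    Returns {requirement_id: weighted mean score}, where a rating from a
    higher crowd level is weighted more heavily (assumed weight = level).
    """
    totals = defaultdict(float)
    weights = defaultdict(float)
    for req_id, level, score in ratings:
        totals[req_id] += level * score
        weights[req_id] += level
    return {req_id: totals[req_id] / weights[req_id] for req_id in totals}

# Example: two requirements rated by crowd members at levels 1 and 2.
crowd = [("R1", 1, 4), ("R1", 2, 5), ("R2", 1, 2), ("R2", 2, 3)]
print(aggregate_ratings(crowd))
```

Ranking the resulting scores would then yield a priority order over the crowd-sourced requirements; any real CrowdReRaT implementation would define its own levels and weighting.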

Index Terms

Computer Science
Information Sciences

Keywords

CrowdRE, Requirement Engineering, Crowd Requirement Rating, Requirement Annotation