Research Article

Smart Citizen Sensing: A Proposed Computational System with Visual Sentiment Analysis and Big Data Architecture

by Kaoutar Ben Ahmed, Mohammed Bouhorma, Mohamed Ben Ahmed
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 152 - Number 6
Year of Publication: 2016
DOI: 10.5120/ijca2016911880

Kaoutar Ben Ahmed, Mohammed Bouhorma, Mohamed Ben Ahmed. Smart Citizen Sensing: A Proposed Computational System with Visual Sentiment Analysis and Big Data Architecture. International Journal of Computer Applications 152, 6 (Oct 2016), 20-27. DOI=10.5120/ijca2016911880

@article{10.5120/ijca2016911880,
  author     = {Kaoutar Ben Ahmed and Mohammed Bouhorma and Mohamed Ben Ahmed},
  title      = {Smart Citizen Sensing: A Proposed Computational System with Visual Sentiment Analysis and Big Data Architecture},
  journal    = {International Journal of Computer Applications},
  issue_date = {Oct 2016},
  volume     = {152},
  number     = {6},
  month      = {Oct},
  year       = {2016},
  issn       = {0975-8887},
  pages      = {20-27},
  numpages   = {8},
  url        = {https://ijcaonline.org/archives/volume152/number6/26324-2016911880/},
  doi        = {10.5120/ijca2016911880},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Kaoutar Ben Ahmed
%A Mohammed Bouhorma
%A Mohamed Ben Ahmed
%T Smart Citizen Sensing: A Proposed Computational System with Visual Sentiment Analysis and Big Data Architecture
%J International Journal of Computer Applications
%@ 0975-8887
%V 152
%N 6
%P 20-27
%D 2016
%I Foundation of Computer Science (FCS), NY, USA
Abstract

A city’s “smartness” depends greatly on citizens’ participation in smart city services. Furthermore, citizens are becoming technology-oriented in every aspect of their convenience, comfort, and safety. They thus act as sensing nodes, or citizen sensors, within smart cities, contributing both static information and a constant stream of activity data. This paper presents a novel approach to visual sentiment analysis of the large volumes of visual data shared on social networks (such as Facebook, Twitter, LinkedIn, and Pinterest) using transfer learning. The proposed approach aims to contribute to the citizen sensing dimension of smart cities. The work extracts deep features from photos shared by users on Twitter via convolutional neural networks and applies transfer learning to predict sentiment. Moreover, we propose a big data architecture to extract, store, and transform raw Twitter image posts into useful insights. We obtained an overall prediction accuracy of 83.35%, which indicates that neural networks are indeed capable of predicting sentiment, revealing interesting research opportunities and applications in the domain of smart sensing.
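
As a rough, hedged illustration of the transfer-learning approach described in the abstract, the Python sketch below uses a CNN pretrained on ImageNet as a frozen deep-feature extractor for tweet photos and trains a lightweight classifier on those features to predict sentiment. The specific choices here (a torchvision ResNet-50, a scikit-learn logistic regression, the preprocessing pipeline, and the labelled-data format) are illustrative assumptions rather than the authors' exact setup; the paper's reference list cites MatConvNet and WEKA.

# Hedged sketch: transfer learning for visual sentiment prediction.
# A CNN pretrained on ImageNet is used as a frozen deep-feature extractor,
# and a simple classifier is trained on those features.
# Model choice, preprocessing, and data format are assumptions, not the
# authors' published pipeline.
import torch
from PIL import Image
from torchvision import models, transforms
from sklearn.linear_model import LogisticRegression

# Pretrained CNN with its classification head replaced by an identity layer,
# so the forward pass returns the pooled deep features (2048-d for ResNet-50).
cnn = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
cnn.fc = torch.nn.Identity()
cnn.eval()

# Standard ImageNet preprocessing expected by the pretrained network.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_features(image_path):
    """Return a deep feature vector for one downloaded tweet image."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return cnn(img).squeeze(0).numpy()

def train_sentiment_classifier(labelled_images):
    """labelled_images: iterable of (image_path, label) pairs, where the
    labels (e.g. 0 = negative, 1 = positive) come from whatever labelled
    Twitter image set is available; this format is hypothetical."""
    X = [extract_features(path) for path, _ in labelled_images]
    y = [label for _, label in labelled_images]
    return LogisticRegression(max_iter=1000).fit(X, y)

The ingestion side of the pipeline (pulling raw Twitter image posts into storage before analysis, for which the references cite Apache Flume, Hadoop, and Hive) is not shown here; an alternative to the frozen extractor above is to fine-tune the upper CNN layers on the sentiment labels, another common transfer-learning variant.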

References
  1. “State of the World’s Cities 2012/2013.” [Online]. Available: http://mirror.unhabitat.org/pmss/listItemDetails.aspx?publicationID=3387. [Accessed: 08-Sep-2016].
  2. T. Stonor, “Smart cities – why, what, how, how?,” The power of the network, 06-Jun-2013.
  3. J. Gabrys, “Programming Environments: Environmentality and Citizen Sensing in the Smart City,” Environ. Plan. Soc. Space, vol. 32, no. 1, pp. 30–48, Feb. 2014.
  4. S. Graham and A. Aurigi, “Urbanising cyberspace?,” City, vol. 2, no. 7, pp. 18–39, May 1997.
  5. A. Mahizhnan, “Smart cities: The Singapore case,” Cities, vol. 16, no. 1, pp. 13–18, Feb. 1999.
  6. L. Anthopoulos and P. Fitsilis, “Exploring architectural and organizational features in smart cities,” in 16th International Conference on Advanced Communication Technology, 2014, pp. 190–195.
  7. R. E. Hall, B. Bowerman, J. Braverman, J. Taylor, H. Todosow, and U. Von Wimmersperg, “The vision of a smart city,” Brookhaven National Lab., Upton, NY (US), 2000.
  8. D. Havlik, S. Schade, Z. A. Sabeur, P. Mazzetti, K. Watson, A. J. Berre, and J. L. Mon, “From Sensor to Observation Web with Environmental Enablers in the Future Internet,” Sensors, vol. 11, no. 4, pp. 3874–3907, Mar. 2011.
  9. M. Conti, M. Kumar, and others, “Opportunities in opportunistic computing,” Computer, vol. 43, no. 1, pp. 42–50, 2010.
  10. J. A. Burke, D. Estrin, M. Hansen, A. Parker, N. Ramanathan, S. Reddy, and M. B. Srivastava, “Participatory sensing,” Cent. Embed. Netw. Sens., 2006.
  11. A. Kavanaugh, A. Ahuja, M. Pérez-Quiñones, J. Tedesco, and K. Madondo, “Encouraging Civic Participation Through Local News Aggregation,” in Proceedings of the 14th Annual International Conference on Digital Government Research, New York, NY, USA, 2013, pp. 172–179.
  12. B. Liu, “Sentiment analysis and opinion mining,” Synth. Lect. Hum. Lang. Technol., vol. 5, no. 1, pp. 1–167, 2012.
  13. D. Joshi, R. Datta, E. Fedorovskaya, Q.-T. Luong, J. Z. Wang, J. Li, and J. Luo, “Aesthetics and emotions in images,” IEEE Signal Process. Mag., vol. 28, no. 5, pp. 94–115, 2011.
  14. Q. You, J. Luo, H. Jin, and J. Yang, “Robust image sentiment analysis using progressively trained and domain transferred deep networks,” ArXiv Prepr. ArXiv150906041, 2015.
  15. A. T. Campbell, S. B. Eisenman, N. D. Lane, E. Miluzzo, and R. A. Peterson, “People-centric urban sensing,” in Proceedings of the 2nd annual international workshop on Wireless internet, 2006, p. 18.
  16. N. Gross, “14: The Earth Will Don An Electronic Skin,” Bloomberg.com, 30-Aug-1999.
  17. D. Cuff, M. Hansen, and J. Kang, “Urban Sensing: Out of the Woods,” Commun ACM, vol. 51, no. 3, pp. 24–33, Mar. 2008.
  18. Frost & Sullivan, “Strategic Opportunity Analysis of the Global Smart City Market Report Brochure,” 2013. [Online]. Available: http://www.frost.com/prod/servlet/report-brochure.pag?id=M920-01-00-00-00. [Accessed: 10-Sep-2016].
  19. E. Miluzzo, N. D. Lane, S. B. Eisenman, and A. T. Campbell, “CenceMe–injecting sensing presence into social networking applications,” in European Conference on Smart Sensing and Context, 2007, pp. 1–28.
  20. A. Sheth, “Citizen sensing, social signals, and enriching human experience,” IEEE Internet Comput., vol. 13, no. 4, p. 87, 2009.
  21. X. Hu, T. H. S. Chu, H. C. B. Chan, and V. C. M. Leung, “Vita: A Crowdsensing-Oriented Mobile Cyber-Physical System,” IEEE Trans. Emerg. Top. Comput., vol. 1, no. 1, pp. 148–165, Jun. 2013.
  22. B. C. Beaumont, “Mumbai attacks: Twitter and Flickr used to break news,” 27-Nov-2008.
  23. R. Arunachalam and S. Sarkar, “The New Eye of Government: Citizen Sentiment Analysis in Social Media,” in Sixth International Joint Conference on Natural Language Processing, p. 23.
  24. J. Villena-Román, “TweetAlert: Semantic Analytics in Social Networks for Citizen Opinion Mining in the City of the Future.”
  25. B. Guthier, R. Alharthi, R. Abaalkhail, and A. El Saddik, “Detection and Visualization of Emotions in an Affect-Aware City,” in Proceedings of the 1st International Workshop on Emerging Multimedia Applications and Services for Smart Cities, New York, NY, USA, 2014, pp. 23–28.
  26. A. Vakali, L. Anthopoulos, and S. Krco, “Smart Cities Data Streams Integration: Experimenting with Internet of Things and Social Data Flows,” in Proceedings of the 4th International Conference on Web Intelligence, Mining and Semantics (WIMS14), New York, NY, USA, 2014, pp. 60:1–60:5.
  27. D. Osimo and F. Mureddu, “Research challenge on opinion mining and sentiment analysis,” Univ. Paris-Sud Lab. LIMSI-CNRS Bâtim., vol. 508, 2012.
  28. S. B. Davis and M. Saunders, “How social media can help improve and redesign transport systems,” The Guardian, 17-Jun-2014.
  29. F. Antonelli, M. Azzi, M. Balduini, P. Ciuccarelli, E. D. Valle, and R. Larcher, “City Sensing: Visualising Mobile and Social Data About a City Scale Event,” in Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2014, pp. 337–338.
  30. A. Vakali, D. Chatzakou, V. Koutsonikola, and G. Andreadis, “Social Data Sentiment Analysis in Smart Environments - Extending Dual Polarities for Crowd Pulse Capturing,” 2013, pp. 175–182.
  31. A. D. I. Kramer, “An Unobtrusive Behavioral Model of ‘Gross National Happiness,’” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2010, pp. 287–290.
  32. E. Cambria, “Affective computing and sentiment analysis,” IEEE Intell. Syst., vol. 31, no. 2, pp. 102–107, 2016.
  33. Y. Kim, “Convolutional neural networks for sentence classification,” ArXiv Prepr. ArXiv14085882, 2014.
  34. C. Xu, S. Cetintas, K.-C. Lee, and L.-J. Li, “Visual Sentiment Prediction with Deep Convolutional Neural Networks,” ArXiv14115731 Cs Stat, Nov. 2014.
  35. C. N. dos Santos and M. Gatti, “Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts.”
  36. T. Chen, D. Borth, T. Darrell, and S.-F. Chang, “DeepSentiBank: Visual Sentiment Concept Classification with Deep Convolutional Neural Networks,” ArXiv14108586 Cs, Oct. 2014.
  37. Q. You, J. Luo, H. Jin, and J. Yang, “Joint Visual-Textual Sentiment Analysis with Deep Neural Networks,” in Proceedings of the 23rd ACM International Conference on Multimedia, New York, NY, USA, 2015, pp. 1071–1074.
  38. Y. Yu, H. Lin, J. Meng, and Z. Zhao, “Visual and Textual Sentiment Analysis of a Microblog Using Deep Convolutional Neural Networks,” Algorithms, vol. 9, no. 2, p. 41, Jun. 2016.
  39. A. Krizhevsky, I. Sutskever, and G. E. Hinton, “Imagenet classification with deep convolutional neural networks,” in Advances in neural information processing systems, 2012, pp. 1097–1105.
  40. R. Girshick, J. Donahue, T. Darrell, and J. Malik, “Rich feature hierarchies for accurate object detection and semantic segmentation,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2014, pp. 580–587.
  41. N. Dalal and B. Triggs, “Histograms of oriented gradients for human detection,” in 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), 2005, vol. 1, pp. 886–893 vol. 1.
  42. D. G. Lowe, “Distinctive Image Features from Scale-Invariant Keypoints,” Int. J. Comput. Vis., vol. 60, no. 2, pp. 91–110, Nov. 2004.
  43. K. Chatfield, K. Simonyan, A. Vedaldi, and A. Zisserman, “Return of the devil in the details: Delving deep into convolutional nets,” ArXiv Prepr. ArXiv14053531, 2014.
  44. L. Torresani, M. Szummer, and A. Fitzgibbon, “Efficient object category recognition using classemes,” in Computer Vision–ECCV 2010, Springer, 2010, pp. 776–789.
  45. L.-J. Li, H. Su, L. Fei-Fei, and E. P. Xing, “Object bank: A high-level image representation for scene classification & semantic feature sparsification,” in Advances in neural information processing systems, 2010, pp. 1378–1386.
  46. D. Borth, R. Ji, T. Chen, T. Breuel, and S.-F. Chang, “Large-scale visual sentiment ontology and detectors using adjective noun pairs,” in Proceedings of the 21st ACM international conference on Multimedia, 2013, pp. 223–232.
  47. J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei, “Imagenet: A large-scale hierarchical image database,” in Computer Vision and Pattern Recognition, 2009. CVPR 2009. IEEE Conference on, 2009, pp. 248–255.
  48. J. Donahue, Y. Jia, O. Vinyals, J. Hoffman, N. Zhang, E. Tzeng, and T. Darrell, “Decaf: A deep convolutional activation feature for generic visual recognition,” ArXiv Prepr. ArXiv13101531, 2013.
  49. A. Vedaldi and K. Lenc, “MatConvNet: Convolutional neural networks for matlab,” in Proceedings of the 23rd Annual ACM Conference on Multimedia Conference, 2015, pp. 689–692.
  50. H. Almuallim and T. G. Dietterich, “Learning With Many Irrelevant Features,” in In Proceedings of the Ninth National Conference on Artificial Intelligence, 1991, pp. 547–552.
  51. L. Yu and H. Liu, “Feature selection for high-dimensional data: A fast correlation-based filter solution.”
  52. M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten, “The WEKA Data Mining Software: An Update,” SIGKDD Explor Newsl, vol. 11, no. 1, pp. 10–18, Nov. 2009.
  53. T. K. Ho, “Random decision forests,” in Proceedings of the Third International Conference on Document Analysis and Recognition, 1995, vol. 1, pp. 278–282.
  54. S. B. Kotsiantis, “Supervised Machine Learning: A Review of Classification Techniques,” in Proceedings of the 2007 Conference on Emerging Artificial Intelligence Applications in Computer Engineering: Real Word AI Systems with Applications in eHealth, HCI, Information Retrieval and Pervasive Technologies, Amsterdam, The Netherlands, The Netherlands, 2007, pp. 3–24.
  55. V. Campos, B. Jou, and X. Giro-i-Nieto, “From Pixels to Sentiment: Fine-tuning CNNs for Visual Sentiment Prediction,” ArXiv160403489 Cs, Apr. 2016.
  56. D. Borth, T. Chen, R. Ji, and S.-F. Chang, “Sentibank: large-scale ontology and classifiers for detecting sentiment and emotions in visual content,” in Proceedings of the 21st ACM international conference on Multimedia, 2013, pp. 459–460.
  57. “Welcome to Apache™ Hadoop®!” [Online]. Available: https://hadoop.apache.org/. [Accessed: 30-May-2016].
  58. “Welcome to Apache Flume — Apache Flume.” [Online]. Available: https://flume.apache.org/. [Accessed: 19-Sep-2016].
  59. “Apache Hive™.” [Online]. Available: https://hive.apache.org/. [Accessed: 19-Sep-2016].
Index Terms

Computer Science
Information Sciences

Keywords

Sentiment analysis, citizen sensing, opportunistic sensing, smart cities, big data, data warehousing