Research Article

A System to Analyze an Effect of Privacy Protection on Direct Discrimination

by Asmita Gorave, Vrushali Kulkarni
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 174 - Number 1
Year of Publication: 2017
Authors: Asmita Gorave, Vrushali Kulkarni
DOI: 10.5120/ijca2017915309

Asmita Gorave, Vrushali Kulkarni. A System to Analyze an Effect of Privacy Protection on Direct Discrimination. International Journal of Computer Applications 174, 1 (Sep 2017), 29-33. DOI=10.5120/ijca2017915309

@article{10.5120/ijca2017915309,
  author     = {Asmita Gorave and Vrushali Kulkarni},
  title      = {A System to Analyze an Effect of Privacy Protection on Direct Discrimination},
  journal    = {International Journal of Computer Applications},
  issue_date = {Sep 2017},
  volume     = {174},
  number     = {1},
  month      = {Sep},
  year       = {2017},
  issn       = {0975-8887},
  pages      = {29-33},
  numpages   = {5},
  url        = {https://ijcaonline.org/archives/volume174/number1/28373-2017915309/},
  doi        = {10.5120/ijca2017915309},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Asmita Gorave
%A Vrushali Kulkarni
%T A System to Analyze an Effect of Privacy Protection on Direct Discrimination
%J International Journal of Computer Applications
%@ 0975-8887
%V 174
%N 1
%P 29-33
%D 2017
%I Foundation of Computer Science (FCS), NY, USA
Abstract

It has recently been observed that data mining faces two major potential risks from a social perspective: discrimination and privacy violation. Discrimination means treating people unfairly simply because they belong to a minority group, without taking their individual qualifications into account. Data mining runs the risk of discrimination when its tasks are performed on a discriminatory dataset. Discrimination Prevention Data Mining is the area that deals with discovering, preventing, and measuring discrimination. Privacy gives a person the right to decide whether or not to disclose his or her sensitive information, and a privacy violation occurs when such information is disclosed as a result of data mining tasks. Privacy Preserving Data Publishing is the area that provides methods for publishing useful information while preserving data privacy. It has recently been recognized that these two areas depend on each other, so it is important to bridge the research gap between them. This paper describes our implemented system, which is used to analyze the effect of privacy protection methods on discrimination. The results of the system show the effect of different privacy protection methods on direct discrimination.
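To make this kind of analysis concrete, the following is a minimal sketch (not the authors' implementation) of how the effect of a privacy protection step on direct discrimination can be quantified. It computes the extended lift (elift) of a classification rule, a standard direct discrimination measure, on a toy dataset before and after suppressing a quasi-identifier. The column names, the toy records, and the suppression step are illustrative assumptions.

import pandas as pd

# Sketch only: the dataset, column names and anonymization step below are
# illustrative assumptions, not the system or datasets described in the paper.

def confidence(df, antecedent, consequent):
    # conf(antecedent -> consequent) = support(antecedent and consequent) / support(antecedent)
    mask = pd.Series(True, index=df.index)
    for col, val in antecedent.items():
        mask &= df[col] == val
    covered = df[mask]
    if covered.empty:
        return 0.0
    col, val = consequent
    return float((covered[col] == val).mean())

def elift(df, protected, context, consequent):
    # Extended lift of the rule (protected, context) -> consequent:
    # conf(protected & context -> consequent) / conf(context -> consequent)
    denom = confidence(df, context, consequent)
    if denom == 0.0:
        return float("inf")
    return confidence(df, {**protected, **context}, consequent) / denom

# Toy decision records standing in for a benchmark dataset such as Adult.
data = pd.DataFrame({
    "gender":          ["female", "female", "male", "male", "female", "male"],
    "zip":             ["12345", "12345", "12345", "67890", "67890", "67890"],
    "credit_approved": ["no", "no", "yes", "yes", "no", "yes"],
})

protected = {"gender": "female"}        # potentially discriminatory item
context   = {"zip": "12345"}            # non-protected context item
outcome   = ("credit_approved", "no")   # negative decision

before = elift(data, protected, context, outcome)

# A crude privacy protection step: suppress the quasi-identifier "zip".
anonymised = data.assign(zip="*****")
after = elift(anonymised, protected, {"zip": "*****"}, outcome)

print(f"elift before suppression: {before:.2f}")   # 1.50 on this toy data
print(f"elift after  suppression: {after:.2f}")    # 2.00 on this toy data

Comparing the two elift values indicates whether the protection step pushes a rule above or below a chosen discrimination threshold alpha; the same comparison extends to other protection methods such as generalization.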

Index Terms

Computer Science
Information Sciences

Keywords

Discrimination prevention, privacy protection, discrimination discovery, data anonymization methods