KDnuggets Analytics, Big Data, Data Mining, and Data Science
Poll: Current
When building Machine Learning / Data Science models in 2018, how often was it important that the model be humanly understandable/explainable? [1080 votes total]

Always (197)
Frequently (259)
Rarely (72)
Never (11)

Employment:
Company or (373)
Government/ non-profit (26)
Academia/ University (52)
Student (80)
Other (10)


Total Comments 2 | Start A New Comment
Post Info Comment
Posted By: Scott H. Jackson

Posted On: 10 days ago
Views: 40
clarity for organizational "buy-in"

In organizations with no established research or data analysis culture, people can be dismissive of new approaches if they don't understand what's going on in a new model. This is especially the case if the new model implies that the previous way of doing things was inefficient or ineffective. For people to buy into the new approach, they need to be able to understand it.

Posted By: Gregory Piatetsky, Editor

Posted On: October 24th
Views: 1637

For careful observers: the percentages are half of what they should be. This is not an error but a hack: I am using this one-question poll to ask two questions. Percentages will be fixed when the final results are published.
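Since the page divides each answer's count by the combined total of 1080 votes rather than by the 539 votes cast for that question alone, the displayed percentages come out at roughly half their true values. A minimal sketch of the correction, using the vote counts from the poll above (rounding to one decimal place is an assumption):

```python
# Vote counts for the "how often explainable?" question, from the poll above.
frequency = {"Always": 197, "Frequently": 259, "Rarely": 72, "Never": 11}

combined_total = 1080                      # both questions' votes pooled together
question_total = sum(frequency.values())   # 539 votes for this question alone

# Percentages as the poll page currently reports them (divided by 1080)...
reported = {k: round(100 * v / combined_total, 1) for k, v in frequency.items()}
# ...and as they should be (divided by this question's own total).
corrected = {k: round(100 * v / question_total, 1) for k, v in frequency.items()}

print(reported)    # "Always" shows as 18.2% instead of 36.5%
print(corrected)
```

So, for example, "Always" was chosen by about 36.5% of respondents to this question, roughly double the figure the poll widget displays.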


Copyright © 2017 KDnuggets.