Welcome! Please contribute your ideas for what challenges we might aspire to solve, changes in our community that can improve machine learning impact, and examples of machine learning projects that have had tangible impact.
Lacking context for this site? Read the original paper: Machine Learning that Matters (ICML 2012). You can also review the slides.
Solicit domain expert "comment" papers on ML work
  • MLJ, JMLR, and other journals could support a new kind of paper submission: a short "comment" paper (1-2 pages) written by a domain expert (e.g., hydrologist, planetary scientist, ecologist) that evaluates specific machine learning work relevant to that domain. These could be solicited by the authors of the original ML paper or submitted independently.

    This would be most effective with a list of points to cover in such a comment paper, e.g.
    1) A short description of the machine learning system
    2) A short description of the problem domain
    3) A summary of how the ML system was deployed in or applied to data from the problem domain
    4) The expert's assessment of:
    4a) the ML system's success/performance (ability to solve the problem)
    4b) the impact of that performance on the problem domain (a lot? a little? does it pave the way for additional work?)
6 Comments
  • Interesting.  Maybe these papers should have one ML reviewer and one reviewer from the domain?  Or is that too much overhead?  If it's for an ML audience/venue, maybe ML reviewers are sufficient.
  • I think the domain scientist would need to be a reviewer, or we'd be in the same boat we're in now.
  • I agree that it would be good to have domain scientists as reviewers (and I've tried a couple of times to recruit colleagues to help look at papers).  But it's hard to find them.  Now you're talking about not just finding qualified ML people (which can be hard enough, as the field grows), but also finding the right qualified domain experts.  Not simple.  Still, I agree that if you want to "vote up" ML papers with real-world impact, you need to get *someone* beyond the ML folks themselves to do evaluation.
  • The angle here would be to get domain experts to write an (evaluative) paper about the ML work -- which might actually be easier than getting them just to review, since this way they get a (minor) publication out of their assessment, too.

    The open question about review was whether the domain-expert-written paper also needs another domain expert to evaluate it (so it isn't just "a lone voice").  But as you say, finding that person is also hard.
  • Oh, right.  That makes sense.

    Hmmm...  Are you picturing getting the domain expert to evaluate a single piece of ML application work?  Or perhaps a collection of related things.  As a simple example of the latter, you could envision collecting up all the ML-for-biology papers for a year and getting a domain expert to do an evaluative survey of them.  That seems like it might be a more substantive contribution from the DE.  OTOH, it requires that there are a pile of such related papers, and that you can recruit someone to read them.

    (The bio example is not actually a great one, since there are multiple entire conferences and journals dedicated to bio+computation, including ML and ML-related thingies, these days.)
  • Also, on the "evaluate a single piece of work" line of thought: IIRC, in some academic communities (stats, I think, and maybe econ), the reviews and the responses to the reviews become part of the publication, with the authors' names attached to each.  It gives people a way to get some credit -- they can point to a prominent paper and say, "Look, I wrote an insightful review on that one, and it helped make the paper better."  I don't know how it's tallied in tenure cases in those communities, but it seems like a mechanism to get credit for hard work done on careful evaluation.
