Harsh Trivedi

[Google Scholar] [Semantic Scholar]


I am a PhD student in computer science at Stony Brook University. I work in the Language Understanding and Reasoning Lab, where I am advised by Professor Niranjan Balasubramanian. My broad interests are in Natural Language Processing and Machine Learning.


Publications

  • Decomposed Prompting: A Modular Approach for Solving Complex Tasks arxiv
    Tushar Khot, Harsh Trivedi, Matthew Finlayson, Yao Fu, Kyle Richardson, Peter Clark, Ashish Sabharwal
    [paper]
  • Two-Turn Debate Does Not Help Humans Answer Hard Reading-Comprehension Questions workshop
    Alicia Parrish*, Harsh Trivedi*, Nikita Nangia, Vishakh Padmakumar, Jason Phang,
    Amanpreet Singh Saimbhi, Samuel R. Bowman
    ML Safety Workshop at NeurIPS, 2022
    [paper]
  • Teaching Broad Reasoning Skills for Multi-Step QA by Generating Hard Contexts conference
    Harsh Trivedi, Niranjan Balasubramanian, Tushar Khot, Ashish Sabharwal
    Conference on Empirical Methods in Natural Language Processing (EMNLP), 2022
    [paper] [data]
  • MuSiQue: Multihop Questions via Single-hop Question Composition journal
    Harsh Trivedi, Niranjan Balasubramanian, Tushar Khot, Ashish Sabharwal
    Transactions of the Association for Computational Linguistics (TACL), to be presented at NAACL 2022
    [paper] [code] [bib]
  • Single-Turn Debate Does Not Help Humans Answer Hard Reading-Comprehension Questions workshop
    Alicia Parrish*, Harsh Trivedi*, Ethan Perez, Angelica Chen, Nikita Nangia, Jason Phang, Samuel R. Bowman
    The First Workshop on Learning with Natural Language Supervision at ACL 2022
    [paper] [data]
  • Summarize-then-Answer: Generating Concise Explanations for Multi-hop Reading Comprehension conference
    Naoya Inoue, Harsh Trivedi, Steven Sinha, Niranjan Balasubramanian, Kentaro Inui
    Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
    [paper] [code] [bib]
  • IrEne-viz: Visualizing Energy Consumption of Transformer Models demo
    Yash Kumar Lal, Reetu Singh, Harsh Trivedi, Qingqing Cao, Aruna Balasubramanian, Niranjan Balasubramanian
    Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
    [paper] [code] [bib]
  • What Ingredients Make for an Effective Crowdsourcing Protocol for Difficult NLU Data Collection Tasks? conference
    Nikita Nangia*, Saku Sugawara*, Harsh Trivedi, Alex Warstadt, Clara Vania, Samuel R. Bowman
    Association for Computational Linguistics (ACL), 2021
    [paper] [code] [bib]
  • IrEne: Interpretable Energy Prediction for Transformers conference
    Qingqing Cao, Yash Kumar Lal, Harsh Trivedi, Aruna Balasubramanian, Niranjan Balasubramanian
    Association for Computational Linguistics (ACL), 2021
    [paper] [code] [bib]
  • Is Multihop QA in DiRe Condition? Measuring and Reducing Disconnected Reasoning conference
    Harsh Trivedi, Niranjan Balasubramanian, Tushar Khot, Ashish Sabharwal
    Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
    [paper] [video] [code] [slides] [bib]
  • DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering conference
    Qingqing Cao, Harsh Trivedi, Aruna Balasubramanian, Niranjan Balasubramanian
    Association for Computational Linguistics (ACL), 2020
    [paper] [video] [slides] [code] [bib]
  • Repurposing Entailment for Multi-Hop Question Answering Tasks conference
    Harsh Trivedi, Heeyoung Kwon, Tushar Khot, Ashish Sabharwal, Niranjan Balasubramanian
    North American Chapter of the Association for Computational Linguistics (NAACL), 2019
    [paper] [code] [slides] [bib]
  • Controlling Information Aggregation for Complex Question Answering conference
    Heeyoung Kwon, Harsh Trivedi, Peter Jansen, Mihai Surdeanu, Niranjan Balasubramanian
    European Conference on Information Retrieval (ECIR), 2018
    [paper] [poster] [bib]
* Equal contribution.