October 28, 2019

leod/hncynic

Generate Hacker News Comments from Titles

repo name: leod/hncynic
repo link: https://github.com/leod/hncynic
homepage:
language: Python
size (curr.): 1868 kB
stars (curr.): 292
created: 2019-02-11
license: MIT License

hncynic

The best Hacker News comments are written with a complete disregard for the linked article. hncynic is an attempt at capturing this phenomenon by training a model to predict Hacker News comments just from the submission title. More specifically, I trained a Transformer encoder-decoder model on Hacker News data. In my second attempt, I also included data from Wikipedia.

The generated comments are fun to read, but often turn out meaningless or contradictory – see here for some examples generated from recent HN titles.

There is a live demo at https://hncynic.leod.org/ and a Twitter bot, @hncynic.

A pretrained model together with some instructions may be found at https://hncynic.leod.org/hncynic-trained-model-v1.tar.gz.
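
Downloading and unpacking the archive can be scripted; the sketch below uses only the Python standard library, and the target directory name is an arbitrary choice of mine, not something the release prescribes.

```python
import tarfile
import urllib.request

MODEL_URL = "https://hncynic.leod.org/hncynic-trained-model-v1.tar.gz"

def fetch_pretrained(dest="hncynic-model-v1"):
    """Download the pretrained model archive and unpack it into `dest`."""
    archive_path, _ = urllib.request.urlretrieve(MODEL_URL)
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(dest)
    return dest

if __name__ == "__main__":
    print("Model unpacked into", fetch_pretrained())
```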

Steps

Hacker News

Train a model on Hacker News data only:

  1. data: Prepare the data and extract title-comment pairs from the HN data dump (see the sketch after this list).
  2. train: Train a Transformer translation model on the title-comment pairs using TensorFlow and OpenNMT-tf.
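
To make the pairing step concrete, here is a minimal, hypothetical sketch of extracting title-comment pairs. It assumes a JSON-lines dump where each item carries `id`, `type`, `parent`, `title`, and `text` fields (roughly the shape of the public HN dataset); the repo's data scripts work on the actual dump format and do considerably more cleanup.

```python
import html
import json

def extract_pairs(dump_path):
    """Yield (title, comment) pairs of top-level comments matched to their story's title.

    The field names below are assumptions about the dump format, not the repo's exact schema.
    """
    titles = {}    # story id -> title
    comments = []  # (parent id, comment text)

    with open(dump_path, encoding="utf-8") as f:
        for line in f:
            item = json.loads(line)
            if item.get("type") == "story" and item.get("title"):
                titles[item["id"]] = item["title"]
            elif item.get("type") == "comment" and item.get("text"):
                comments.append((item["parent"], html.unescape(item["text"])))

    for parent_id, text in comments:
        title = titles.get(parent_id)
        if title is not None:  # parent is a story, i.e. a top-level comment
            yield title, text

if __name__ == "__main__":
    for title, comment in extract_pairs("hn_dump.jsonl"):
        print(f"{title}\t{comment[:80]}")
```

Only comments whose parent is a story are kept here, matching the note under Future Work that replies were excluded; the resulting pairs would then be written to parallel source (title) and target (comment) files for OpenNMT-tf training.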

Transfer Learning

Train a model on Wikipedia data, then switch to Hacker News data:

  1. data-wiki: Prepare data from Wikipedia articles (see the sketch after this list).
  2. train-wiki: Train a model to predict Wikipedia section texts from titles.
  3. train-wiki-hn: Continue training on HN data.
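
As a rough illustration of the data-wiki step, the sketch below pairs section headings with section texts from an article's wikitext. Combining the article title with the heading on the source side is my assumption about a sensible format, not necessarily what the repo's scripts emit, and real preprocessing of a Wikipedia dump involves far more cleanup.

```python
import re

# Matches wikitext headings such as "== History ==" or "=== Early life ===".
SECTION_RE = re.compile(r"^==+\s*(.*?)\s*==+\s*$", re.MULTILINE)

def wiki_sections(article_title, wikitext):
    """Yield (source, target) pairs: 'article title: heading' -> section text."""
    headings = list(SECTION_RE.finditer(wikitext))
    for i, match in enumerate(headings):
        start = match.end()
        end = headings[i + 1].start() if i + 1 < len(headings) else len(wikitext)
        body = wikitext[start:end].strip()
        if body:
            yield f"{article_title}: {match.group(1)}", body

if __name__ == "__main__":
    text = "Lead paragraph.\n== History ==\nIt has a history.\n== Reception ==\nIt was received."
    for source, target in wiki_sections("Example", text):
        print(source, "->", target)
```

For the train-wiki-hn step, the idea is then to continue training from the Wikipedia-trained checkpoint with the HN title-comment files swapped in as the training data.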

Hosting

  1. serve: Serve the model with TensorFlow Serving (a query sketch follows after this list).
  2. ui: Host a web interface for querying the model.
  3. twitter-bot: Run a Twitter bot.
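
Once an exported model is behind TensorFlow Serving, it can be queried over the REST API. The sketch below assumes the default REST port (8501), a model name of hncynic, and a tokens/length input signature of the kind OpenNMT-tf exports typically expose; these names are assumptions, and the repo's serving setup may use gRPC or different signatures.

```python
import requests

def generate_comment(title_tokens, host="localhost", port=8501, model="hncynic"):
    """Send a tokenized title to TensorFlow Serving and return the raw prediction.

    The 'tokens'/'length' input names and the model name are assumptions; check
    the exported SavedModel's serving signature for the real ones.
    """
    payload = {"instances": [{"tokens": title_tokens, "length": len(title_tokens)}]}
    url = f"http://{host}:{port}/v1/models/{model}:predict"
    response = requests.post(url, json=payload, timeout=30)
    response.raise_for_status()
    return response.json()["predictions"][0]

if __name__ == "__main__":
    # The title must be tokenized the same way as the training data was.
    print(generate_comment(["Show", "HN", ":", "I", "built", "a", "thing"]))
```

The web UI and Twitter bot would then presumably query the model through an endpoint like this with a submitted title.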

Future Work

  • Acquire GCP credits, train for more steps.
  • It’s probably not ideal to use an encoder-decoder model here. In retrospect, I should have trained a language model instead, on data like title <SEP> comment (see also: GPT-2); a rough sketch of that format follows after this list.
  • I’ve completely excluded HN comments that are replies from the training data. It might be interesting to train on these as well.
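
For reference, the single-sequence format mentioned in the second bullet could look roughly like the sketch below; the <SEP> token and one-example-per-line layout are illustrative choices, not something the repo implements.

```python
def to_lm_lines(pairs, sep="<SEP>"):
    """Turn (title, comment) pairs into single lines for language-model training."""
    for title, comment in pairs:
        # Collapse whitespace so each example stays on one line.
        yield f"{title} {sep} {' '.join(comment.split())}"

if __name__ == "__main__":
    pairs = [("Ask HN: Is this a good idea?", "No, but neither was Dropbox.")]
    for line in to_lm_lines(pairs):
        print(line)
```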