Neural networks are getting smarter, and their outputs are becoming increasingly realistic. Here is an example created by Jahred Adelman, who used the Multi-Layer Recurrent Neural Network (RNN) package designed for character-level language models. The neural net was trained on 600,000+ real paper titles taken from INSPIRE and then used to generate "fake" paper titles. Use this link to find out which article title is fake and which is real.
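For readers curious what such a model looks like in code, below is a minimal sketch of a character-level RNN language model in PyTorch. It is an illustration of the general idea only, not the actual package or training data used above: the `CharRNN` class, the tiny placeholder corpus, and all hyperparameters are assumptions made for the example, standing in for the 600,000+ INSPIRE titles.

```python
# Minimal sketch of a character-level RNN language model (PyTorch).
# NOT the actual package used above; the tiny "titles" list is a
# hypothetical placeholder for the 600,000+ INSPIRE paper titles.
import torch
import torch.nn as nn

titles = [
    "Measurement of jet production in proton-proton collisions\n",
    "Search for new resonances decaying to dilepton final states\n",
]  # placeholder corpus; in practice, one real title per line

text = "".join(titles)
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}  # char -> index
itos = {i: c for c, i in stoi.items()}      # index -> char

class CharRNN(nn.Module):
    """Multi-layer LSTM that predicts the next character."""
    def __init__(self, vocab, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.rnn = nn.LSTM(hidden, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, x, state=None):
        h, state = self.rnn(self.embed(x), state)
        return self.head(h), state

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: at every position, predict the following character.
data = torch.tensor([stoi[c] for c in text]).unsqueeze(0)
for step in range(200):
    logits, _ = model(data[:, :-1])
    loss = loss_fn(logits.reshape(-1, len(chars)), data[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sampling: generate a "fake" title one character at a time.
model.eval()
with torch.no_grad():
    x = data[:, :1]          # seed with the first character of the corpus
    state, out = None, []
    for _ in range(120):
        logits, state = model(x, state)
        probs = torch.softmax(logits[0, -1], dim=-1)
        x = torch.multinomial(probs, 1).view(1, 1)
        ch = itos[x.item()]
        if ch == "\n":       # newline ends a title
            break
        out.append(ch)
    print("".join(out))
```

With a corpus this small the samples are essentially memorized; the realistic fake titles described above come from training a model like this on the full set of real titles.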
S.Chekanov (ANL)