Enhancement of chromatograms, such as the reduction of baseline noise and baseline drift, is often essential to accurately detect and quantify analytes in a mixture. Existing methods have been studied and refined for decades and have helped researchers obtain reliable results. However, these methods rely on relatively simple statistics of the data (chromatograms), which in some cases causes significant information loss and inaccuracies. In this study, a deep one-dimensional convolutional autoencoder was developed that simultaneously removes baseline noise and baseline drift with minimal information loss, for a large number and great variety of chromatograms. To enable the autoencoder to render a chromatogram almost or completely noise-free, it was trained on data from a purpose-built chromatogram simulator that generated 190,000 representative simulated chromatograms. The trained autoencoder was then compared to some of the most widely used and well-established denoising methods on test datasets of tens of thousands of simulated chromatograms, and further verified on real chromatograms. The results show that the developed autoencoder can successfully remove baseline noise and baseline drift simultaneously with minimal information loss, outperforming methods such as Savitzky-Golay smoothing, Gaussian smoothing and wavelet smoothing for baseline noise reduction (root mean squared error of 1.094 mAU compared to 2.074 mAU, 2.394 mAU and 2.199 mAU), and Savitzky-Golay smoothing combined with asymmetric least-squares or polynomial fitting for combined baseline noise and baseline drift reduction (root mean absolute error of 1.171 mAU compared to 3.397 mAU and 4.923 mAU).
Evidence is presented that autoencoders can be used to enhance and correct chromatograms and consequently improve and simplify downstream data analysis, with the drawback of needing a carefully implemented simulator that generates realistic chromatograms to train the autoencoder.
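The abstract does not specify how the simulator constructs its training chromatograms. A minimal sketch of this kind of simulator, assuming Gaussian peaks, a low-order polynomial drift term, and additive white noise (all parameter ranges below are illustrative assumptions, not the values used in the study), might look like:

```python
import numpy as np

def simulate_chromatogram(n_points=1024, n_peaks=5, rng=None):
    """Sketch of a chromatogram simulator: Gaussian analyte peaks plus
    baseline drift and baseline noise. Returns a (noisy, clean) training
    pair for a denoising autoencoder."""
    rng = np.random.default_rng(rng)
    t = np.linspace(0.0, 1.0, n_points)

    # Clean signal: a sum of Gaussian peaks with random retention
    # times, heights (mAU) and widths (hypothetical ranges).
    clean = np.zeros(n_points)
    for _ in range(n_peaks):
        center = rng.uniform(0.05, 0.95)
        height = rng.uniform(10.0, 200.0)
        width = rng.uniform(0.002, 0.01)
        clean += height * np.exp(-0.5 * ((t - center) / width) ** 2)

    # Baseline drift: a slowly varying low-order polynomial.
    drift = 20.0 * t + 10.0 * t ** 2

    # Baseline noise: additive white Gaussian noise.
    noise = rng.normal(0.0, 2.0, size=n_points)

    noisy = clean + drift + noise
    return noisy, clean
```

Each pair would then be used to train the autoencoder, with the noisy trace as input and the clean, drift-free trace as the reconstruction target, so that both baseline noise and baseline drift are removed in a single pass.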