{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,6,18]],"date-time":"2025-06-18T04:22:28Z","timestamp":1750220548043,"version":"3.41.0"},"reference-count":0,"publisher":"Association for Computing Machinery (ACM)","issue":"4","license":[{"start":{"date-parts":[[2020,12,14]],"date-time":"2020-12-14T00:00:00Z","timestamp":1607904000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["SIGACT News"],"published-print":{"date-parts":[[2020,12,14]]},"abstract":"<jats:p>On the surface, the concept of preprocessing may not seem particularly novel nor insightful. Indeed, the simple idea that putting in computational effort upfront may pay off in dividends of time saved later is not only advice that we have received countless times but it is also advice that we ourselves have given to countless others. It may well seem a clich\u00e9, but it turns out to be more powerful than it first appears to be, and in fact plays a rich role in numerous areas of algorithmic research and is a powerful tool for myriad applications. This turns out to be particularly true in Computer Science where pattern repetition within and between algorithms is ubiquitous. The idea of storing results once to avoid repeating them innumerable times can pay huge dividends in computational problems of many kinds. So, while possibly overdone, the advice is still sound and often revelatory.<\/jats:p>","DOI":"10.1145\/3444815.3444819","type":"journal-article","created":{"date-parts":[[2021,1,14]],"date-time":"2021-01-14T23:20:17Z","timestamp":1610666417000},"page":"11-14","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":0,"title":["Review of Kernelization"],"prefix":"10.1145","volume":"51","author":[{"given":"Tim","family":"Jackman","sequence":"first","affiliation":[{"name":"Boston University, Boston, MA, USA"}]},{"given":"Steve","family":"Homer","sequence":"additional","affiliation":[{"name":"Boston University, Boston, MA, USA"}]}],"member":"320","published-online":{"date-parts":[[2021,1,14]]},"container-title":["ACM SIGACT News"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3444815.3444819","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3444815.3444819","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T21:28:12Z","timestamp":1750195692000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3444815.3444819"}},"subtitle":["Theory of Parameterized Preprocessing by Fedor V. Fomin, Daniel Lokshtanov, Saket Saurabh, and Meirav Zehavi"],"short-title":[],"issued":{"date-parts":[[2020,12,14]]},"references-count":0,"journal-issue":{"issue":"4","published-print":{"date-parts":[[2020,12,14]]}},"alternative-id":["10.1145\/3444815.3444819"],"URL":"https:\/\/doi.org\/10.1145\/3444815.3444819","relation":{},"ISSN":["0163-5700"],"issn-type":[{"type":"print","value":"0163-5700"}],"subject":[],"published":{"date-parts":[[2020,12,14]]},"assertion":[{"value":"2021-01-14","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}