Authors: Ak, Buse; Güngör, Tunga
Date accessioned: 2023-03-03
Date available: 2023-03-03
Date issued: 2022-06-01
URI: https://tulap.cmpe.boun.edu.tr/handle/20.500.12913/53
Abstract: Tokenization is the process of segmenting a text into tokens. Given a text, the tokenizer identifies the tokens (words, punctuation marks, etc.) within the text and outputs them separately. This process is necessary for applications that work on a per-token basis.
Keywords: Turkish; Tokenization; Word splitting; Word segmentation; Tokenizer
Type: Tool; Service
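The abstract describes tokenization as splitting a text into words and punctuation marks. The following is a minimal illustrative sketch of that idea in Python using a regular expression; it is not the TULAP tool itself, whose implementation is not shown in this record.

```python
import re

def tokenize(text):
    # Illustrative sketch only, not the TULAP tokenizer:
    # \w+ matches runs of word characters (Unicode-aware in Python 3,
    # so Turkish letters like ü and ğ are included), and [^\w\s]
    # matches individual punctuation marks; whitespace is discarded.
    return re.findall(r"\w+|[^\w\s]", text)

# Each word and punctuation mark becomes a separate token.
print(tokenize("Merhaba, dünya! Bu bir test."))
# → ['Merhaba', ',', 'dünya', '!', 'Bu', 'bir', 'test', '.']
```

A real tokenizer for Turkish would also need to handle cases a simple regex misses, such as abbreviations with periods, numbers with separators, and apostrophized suffixes (e.g. "İstanbul'da").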