R

Data packages for current and future me

tl; dr I show why it is worthwhile to put my Chinese-related datasets in packages and how I went about it. Introduction I don’t know if I’m very late to

Rbootcamp 2019

tl; dr Below you’ll find what we did during the Rbootcamp for Lexical Semanticists. In between this paragraph and the contents, there is a bit of my own #Rstory. Warning,

Tidy collostructions

tl; dr In this post I look at the family of collexeme analysis methods originated by Gries and Stefanowitsch. Since they use a lot of Base R, and love

Guanguan goes the Chinese Word Segmentation (II)

tl; dr This double blog is first about the opening line of the Book of Odes, and later about how to deal with Chinese word segmentation, and my current implementation

Guanguan goes the Chinese Word Segmentation (I)

tl; dr This double blog is first about the opening line of the Book of Odes, and later about how to deal with Chinese word segmentation, and my current implementation

Mapping the terminology for ideophones

#Goal The goal for this short update is to use the R package lingtypology (click here for the tutorial) to create a map that shows which for which