I loaded the Markov chain with data from some of my recent journal entries. Here’s some more content that came out of it:

-=-=-=-
a third slugcat traveling through an unknown region it had marie the
colubrine sector thread to force eject it had a karma indicator i was
a relatively short period of blank chain in
-=-=-=-
a karma gate while the chainValues array when i panicked a slugpup
aside from there! i've read that would add up with
-=-=-=-
that there's a scenario then became a very bad idea
-=-=-=-
pile of them worried among themselves about mobile suit baba could it
turned into a farm arrays variant filled with a cap of them being
actual slugpup aside from there! i've read that the colubrine sector
thread directory adding the code into a variant filled with moving
parts and all files intact phew
-=-=-=-
a player following the form of repetitive rain world scenario then it
earlier today so when i had marie the stick i did some reason napoleon
and another slugcat taking place in Swift I made of a slugpups-per-
player rule instead that limit was exploring the reader to each player
disguised as a little however i had a chance to saint and lafayette but
it had a little however i did some sort of repetitive rain world
scenario then it earlier today so when i reloaded the relationship
between
-=-=-=-

The problem is that it doesn’t have nearly enough data . . .

I finished making the Markov chain that I mentioned in my previous post. I would share it, but I don't have a good medium for doing so. However, it works.

sitting in the drink fell and there was peace on
or not the following are the dumpling shaft
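For the curious, here's a rough sketch of how a word-level chain like this can work. It is not the actual script (that one apparently has a `chainValues` array; this sketch uses a dictionary of successor lists instead), and the `MarkovChain` name and its methods are just for illustration:

```swift
// A word-level Markov chain: for each word, remember every word
// that has been observed immediately after it.
struct MarkovChain {
    private var successors: [String: [String]] = [:]

    // Record word-to-word transitions from a block of training text.
    mutating func train(on text: String) {
        let words = text.lowercased().split(separator: " ").map(String.init)
        guard words.count > 1 else { return }
        for i in 0..<(words.count - 1) {
            successors[words[i], default: []].append(words[i + 1])
        }
    }

    // Walk the chain from a random starting word, picking a random
    // recorded successor each step, until `length` words or a dead end.
    func generate(length: Int) -> String {
        guard var current = successors.keys.randomElement() else { return "" }
        var output = [current]
        while output.count < length, let next = successors[current]?.randomElement() {
            output.append(next)
            current = next
        }
        return output.joined(separator: " ")
    }
}

// Example: feed it a journal entry and ask for twenty words.
var chain = MarkovChain()
chain.train(on: "a slugcat traveling through an unknown region found a karma gate")
print(chain.generate(length: 20))
```

Training just records, for every word, which words have been seen after it; generation walks those records one random successor at a time, which is exactly why a tiny input produces such repetitive, looping output.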

comments

how large is it (in MBs)?

*plugs in the stick*

The file that contains both the testing data and the journal data is 7 kilobytes, and the Markov chain script itself is 4 kilobytes.

oof, calling it barely anything is a big understatement

in my experience, 1 MB is still considered small, though I’m not sure if that applies to Markov chains as well

Noted. I'll get to work on expanding the database soon.
