Link to Jupyter Notebook.
The motivation behind this assignment was to gain more comfortable footing with procedurally generating poetry in Python, aiming for output that produced something worthwhile at least six or seven times out of ten. By that standard, my first attempt was a failure, and I’ll be sure to revisit and reshape it going forward.
What I’d like to do next is separate words into parts of speech and distinguish between singular and plural. More generally, I’d like to generate the poetry more Pythonically: more rule-based and programmed, with instructions that are at once more explicit and less hand-assembled. So, less variable + " " + variable, and more actual programming.
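The kind of rule-based generation I have in mind might look something like the following sketch. The word lists, category names, and templates here are my own illustrations, not taken from the notebook; the point is that lines come from part-of-speech slots rather than hand-concatenated variables.

```python
import random

# Illustrative word bank, grouped by part of speech, with
# singular and plural noun forms kept separate.
WORDS = {
    "adjective": ["pale", "hollow", "electric"],
    "noun_singular": ["star", "river", "door"],
    "noun_plural": ["stars", "rivers", "doors"],
    "verb_plural": ["gather", "dissolve", "burn"],
}

# Each template is a sequence of part-of-speech slots; a line is
# produced by filling the slots, not by concatenating variables.
TEMPLATES = [
    ("adjective", "noun_plural", "verb_plural"),
    ("adjective", "noun_singular"),
]

def generate_line(rng=random):
    """Pick a template at random and fill each slot with a matching word."""
    template = rng.choice(TEMPLATES)
    return " ".join(rng.choice(WORDS[slot]) for slot in template)

poem = "\n".join(generate_line() for _ in range(3))
print(poem)
```

Even a scheme this small enforces grammatical agreement (plural nouns pair with plural verbs), which a bare string concatenation never guarantees.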
The following quote from Charles Hartman (from his Virtual Muse) is highly germane to this kind of work, or at least at this stage in my experience with it, and was lodged in my mind as I was working: “The reader’s mind works most actively on sparse materials. We draw the clearest constellations from the fewest stars. So, the nonsense factor is low for a tiny collocation of words that can be imbued with imagistic significance.”
It is a fragile business, maintaining the balance between that auspicious, even fortuitous zone of sense (or sensible nonsense) and a random, garbled motley of unintentionally assembled words.
The reader is often willing to do the work of filling in the gaps of randomness, ascribing or projecting sense and intention, or otherwise leaving themselves open to a frisson at the chance congregation of several words. But the program must give the reader something to work with, and there is too little intention and definition in my current design.