Trouble with Japanese text file and wordcloud output in Python
I referenced amueller's word cloud examples and a few others, and came up with something like this:
```python
#!c:/Python27/python.exe
# coding: UTF-8
from os import path
from wordcloud import WordCloud
import MeCab as mc

d = path.dirname("C:\Users\BobLeponge\Desktop\jpn\JPNTEXT.txt")
text = open(path.join(d, 'JPNTEXT.txt')).read()
text = text.decode("utf-8")

def mecab_analysis(text):
    t = mc.Tagger('-Ochasen -d/usr/local/Cellar/mecab/0.996/lib/mecab/dic/mecab-ipadic-neologd/')
    enc_text = text.encode('utf-8')
    node = t.parseToNode(enc_text)
    output = []
    while node:
        if node.surface != "":
            word_type = node.feature.split(",")[0]
            if word_type in ["形容詞", "動詞", "名詞", "副詞"]:
                output.append(node.surface)
        node = node.next
        if node is None:
            break
    return output

def create_wordcloud(text):
    fpath = "C:\WINDOWS\Fonts\NotoSansMonoCJKjp-Regular.otf"
    stop_words = [u'てる', u'いる', u'なる', u'れる', u'する', u'ある', u'こと', u'これ', u'さん', u'して',
                  u'くれる', u'やる', u'くださる', u'そう', u'せる', u'した', u'思う',
                  u'それ', u'ここ', u'ちゃん', u'くん', u'', u'て', u'に', u'を', u'は', u'の', u'が', u'と', u'た', u'し', u'で',
                  u'ない', u'も', u'な', u'い', u'か', u'ので', u'よう', u'']
    wordcloud = WordCloud(background_color="white", font_path=fpath, width=900, height=500,
                          stopwords=set(stop_words)).generate(text)

import matplotlib.pyplot as plt
wordcloud = WordCloud(background_color="white", width=900, height=500).generate(text)
plt.figure()
plt.imshow(wordcloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```
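As an aside on how `WordCloud` consumes tokenizer output: `generate()` expects a single whitespace-separated string, not a list, so the token list returned by `mecab_analysis` would need to be joined before being passed in. A minimal sketch of that step (the helper name and the token values are illustrative, not part of the original code):

```python
# -*- coding: utf-8 -*-

def tokens_to_text(tokens):
    """Collapse a list of token surface forms into the single
    whitespace-separated string that WordCloud.generate() expects."""
    return u" ".join(tokens)

# Illustrative tokens (not real MeCab output):
print(tokens_to_text([u"吾輩", u"猫", u"名前"]))
```

The result could then be fed to `generate()` in place of the raw file contents.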
When I execute it, this image pops up (screenshot not included here).
I'm still a beginner at Python, so I'm not really sure what's wrong. I checked the encoding of the text file and it was UTF-8, and `# coding: UTF-8` is at the top of the script, so I assumed it would work. What am I doing wrong here?
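Since the question centers on UTF-8 handling, one way to sidestep the manual `.decode("utf-8")` step is `io.open` with an explicit `encoding`, which returns unicode text on both Python 2 and 3. A small self-contained sketch (the file name and sample text are illustrative):

```python
# -*- coding: utf-8 -*-
import io
import os
import tempfile

# Write a small UTF-8 file, then read it back as unicode text.
# io.open with encoding="utf-8" decodes for us on both Python 2 and 3.
sample_path = os.path.join(tempfile.gettempdir(), "jpn_sample.txt")
with io.open(sample_path, "w", encoding="utf-8") as f:
    f.write(u"これはテストです")

with io.open(sample_path, "r", encoding="utf-8") as f:
    text = f.read()

print(text)
```

With this approach, `text` is already a unicode string, so no separate `decode` call is needed before tokenizing.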
Solution 1:[1]
One of the most important things to consider is whether your ORM supports composite primary keys, which are very common in Citus. GORM supports these: https://gorm.io/docs/composite_primary_key.html, so I don't see any reason why it would not work.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow

