python, Python alien-invasion game, PyCharm, pygame: it's finished, but there's a bug; hoping someone can help me find the cause

**What men (and also some women) imagine "big breasts" to mean.**

[Link] Why do statistics say the average Chinese woman is an A cup, yet so many people around me and online seem to have large breasts? - Zhihu

**Fresh examples (screenshots from the Zhihu answer section):**

**1. "A C cup already counts as fairly big; a D is definitely big; E or F means huge; and if it's G... wait, G exists? That must be too big for the eyes to take in!"**

[Link] How heavy are breasts of different cup sizes? This comparison chart explains it... - Baozou Daily

**2. "Chinese women have the smallest breasts in the world, all A or A-; American girls are bigger, and their A cup is about the size of a Chinese B or C."**

**3. "If Chinese women claim to be D, E, F, or G, they must be bragging, or else it's all sponge padding."**

Is any of that actually true?

No.

The cup letter is computed from the difference between the bust and underbust measurements, and the same difference looks completely different on a slim frame versus a larger one. A woman with a 65 cm underbust and an 85 cm bust has a 20 cm difference and wears an E cup, yet does not look particularly busty. A woman with an 85 cm underbust and the same 20 cm difference also wears an E cup, but her bust is then 105 cm; she is the one who reads as "big". By the same logic, a C cup cannot be shockingly large, a 75B is nowhere near a big bust, and "double B" is a size invented by people who clearly know nothing about lingerie. Look at the chart below: G is nothing like what you imagined, is it?

An American A and a Chinese A are essentially the same. Because the calculation differs, US sizing can have DD and FF cups, so the larger letters do not map one-to-one onto Chinese sizes, but below a D there is no meaningful difference. "A US A equals a Chinese B or C" simply does not hold, and as a way to mock Chinese women for being flat-chested it is a feeble one. I wear a Chinese 70D (32D); grabbing one of my own Mimi Holliday bras, made in the US, US size 32D, it fits exactly, and an unlined style like this leaves no room for hidden padding. So: why not argue from facts?

Claiming that Chinese women's larger cup sizes are all invented just shows a lack of both experience and basic knowledge. Compared with American women, Chinese women tend to be slimmer, meaning smaller band sizes are more common; and a small band with a large cup often does not read visually as a large bust. In other words, a big letter and a big-looking bust are two different things. On top of that, China's lingerie industry lags badly: sizing is very incomplete, the prevailing aesthetics are odd, and many women end up squeezing themselves into bras that are too small (which conveniently manufactures cleavage), so the size a woman buys frequently does not match her real measurements.

**So really, stop fantasizing that D must mean big, that F must mean enormous, and that any Chinese woman without basketballs on her chest must be faking it. A little basic knowledge never hurts; at minimum it will keep you from ranting online, misogynist-style, about how Chinese women are flat-chested braggarts.**

**After all, cup-size conversion charts are right there, and any woman who understands lingerie knows that a bigger letter does not mean more breast volume. Why cling to a folk-tale fantasy, nail Chinese women to a pillory of "flat-chested and lying", and then blame them for not living up to your imagination?**

**I feel a five-second pang of pity for that kind of intellect...**
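The sizing arithmetic in the paragraphs above can be sketched as a small function. The difference-to-cup table below is a simplified illustration only (2.5 cm steps starting at A = 10 cm, consistent with the 20 cm → E example in the text); real sizing charts vary by country and brand.

```python
# Illustrative only: simplified difference-to-cup lookup using 2.5 cm steps
# (A = 10 cm, B = 12.5 cm, ...). Real charts differ by country and brand.
def cup_letter(bust_cm, underbust_cm):
    """Map the bust/underbust difference to a cup letter (simplified)."""
    diff = bust_cm - underbust_cm
    cups = [("A", 10.0), ("B", 12.5), ("C", 15.0), ("D", 17.5),
            ("E", 20.0), ("F", 22.5), ("G", 25.0)]
    # Pick the cup whose nominal difference is closest to the measured one.
    return min(cups, key=lambda c: abs(c[1] - diff))[0]

# Both wearers from the text measure as E cups despite very different
# visual sizes: the letter encodes only the difference, not the volume.
print(cup_letter(85, 65))    # → E  (slim wearer, 65 cm underbust)
print(cup_letter(105, 85))   # → E  (larger wearer, 85 cm underbust)
```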
Going from 0 to 1 is the hardest, most laborious step.

Back in 2015 I posted a thread on another forum, also about learning Python from zero, and it got plenty of reads. I hope it helps.

Original post: Python + Flask + SQLAlchemy, six months of self-study, successful job switch: http://cocode.cc/t/python-flask-sqlalchemy/1884 (the formatting there is better than on Zhihu).

**Before you read:**

I majored in CS. I was a poor programmer in university and did not do development work after graduating, but I paid attention in class (a scholarship and some other honors), so I still remembered some language syntax basics and the other required CS courses. I first touched Python six months before writing this, and it opened the door to a new world.

I recently went job hunting and was happy to get several offers; I have accepted the one I like by far the most. I just finished a bowl of sauerkraut fish and I am in a good mood, so I have decided to share my learning path and experience, in the hope it helps others too.

I am still fairly mediocre, though, and have homework to do before joining the new company so I do not hold everyone else up. Time is short, so this post will be rough and sketchy. I will try to answer questions, but I cannot promise I will have the time or the ability. For reference only (yes, that is the disclaimer :D).

**The actual content:**

Overall route: Python + Pygame + Flask + SQLAlchemy

Main books:

1. Learn Python the Hard Way (ebook)
2. Core Python Programming (ebook)
3. Beginning Python (paperback)
4. Flask Web Development: Developing Web Applications with Python (ebook)
Main reference sites:

1. Assignments — Problem Solving with Algorithms and Data Structures
   http://interactivepython.org/runestone/static/pythonds/index.html
2. SQLite – Python | w3cschool tutorial
   http://www.w3cschool.cc/sqlite/sqlite-python.html
3. Google Python Style Guide, Chinese translation | Elias's homepage
   /Python/PythonStyleGuide?from=Develop.PythonStyleGuide#toc32
4. A Guide to Python's Magic Methods - OSChina
   http://www.oschina.net/question/412
5. Writing Games with Python and Pygame, from Novice to Professional (index) | eyehere blog
   http://eyehere.net/2011/python-pygame-novice-professional-index/
6. The Flask Mega-Tutorial — flask mega-tutorial 1.0 documentation
   /flask-mega-tutorial/
7. Welcome to Flask — Flask 0.10.1 documentation
   /flask/
8. Package sources:
   http://www.lfd.uci.edu/~gohlke/pythonlibs/ and /winpy_libs
9. Other: 25 free Python ebooks - Jobbole; 9 free Python programming books - Jobbole

My GitHub: dodoru. You can see many of my practice repos and the code I forked to study. A lot of it is flawed; it is fine for gauging my progress and level at the time, but honestly my code is not good, so it is not a great model to imitate.
Study schedule

Note: I studied in my spare time, generally 8:00 pm to 12:00 am on weeknights plus weekends. Miscellaneous things kept coming up, so I can only vouch for at least two hours a day; adjust the pace to however much idle time you have.

**Month 1: Basics** (3.12 - 4.12)

Repo: /dodoru/learn_in_python

Week 1: installed Python 2.7 and practiced basic syntax with Learn Python the Hard Way.

Week 2: was recommended PyCharm: extremely good, strongly recommended. I changed Tab to four spaces in the default settings, then practiced by writing a stack.

Week 3: practiced basic data structures by writing a linked list and a queue myself.
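The hand-rolled structures from these exercises might look like this minimal sketch (names and API are illustrative, not the author's actual code):

```python
# Minimal stack (LIFO) and queue (FIFO) of the kind the exercises call for.
# collections.deque gives O(1) appends and pops at both ends.
from collections import deque

class Stack:
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()       # last pushed comes out first
    def __len__(self):
        return len(self._items)

class Queue:
    def __init__(self):
        self._items = deque()
    def enqueue(self, item):
        self._items.append(item)
    def dequeue(self):
        return self._items.popleft()   # first enqueued comes out first
    def __len__(self):
        return len(self._items)

s, q = Stack(), Queue()
for x in (1, 2, 3):
    s.push(x)
    q.enqueue(x)
print(s.pop(), q.dequeue())  # → 3 1
```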
Week 4: more or less finished typing through Learn Python the Hard Way.

**Month 2: Consolidation** (4.12 - 5.12)

Week 1: because my memory is poor, I rewrote the basic stack, queue, and linked list from scratch.

Week 2: read Core Python Programming, the basics part.

Week 3: kept reading Core Python Programming; attempted a simple Dou Shou Qi (Jungle chess) program (failed).

Week 4: kept reading Core Python Programming; kept revising the Dou Shou Qi program (failed again), switched to Gomoku (five in a row), and started touching Pygame.

Note: over the following two months (May and June) my day job (computer-related, but not development) was very busy, so there was not much time to write code and progress slows down. The Pygame work in this period was to drill basic programming skills and to build my enjoyment of programming in Python. If you already have a decent programming foundation and a long-standing strong love of programming, you can skip the two Pygame months.
**Month 3: Pygame A** (5.12 - 6.12)

Repo: /dodoru/LearnPygame

Week 1: kept working on Gomoku, then started a brick-breaking game ("breakblock") with Pygame.

Week 2: continued the brick breaker; to store the game's data, picked up a little JSON along the way.
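Storing game data with the stdlib json module boils down to a dump/load pair. A minimal sketch (the file name and state fields here are made up for illustration):

```python
# Save and reload simple game state as JSON; field names are illustrative.
import json
import os
import tempfile

state = {"level": 3, "score": 1250, "bricks_left": 17}

path = os.path.join(tempfile.gettempdir(), "breakblock_save.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(state, f)            # serialize the dict to disk

with open(path, "r", encoding="utf-8") as f:
    loaded = json.load(f)          # round-trip it back into a dict

print(loaded == state)  # → True
```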
Week 3: finished the brick breaker and felt a real sense of achievement. Energy at MAX.

Week 4: tried to copy someone else's Pygame music player and failed; it contained a manage.py I could not yet understand, so I skipped it.
**Month 4: Pygame B + Flask (first steps)** (6.12 - 7.12)

Week 1: copied someone's aircraft-shooter game and learned to create game effects with sound (fun, but otherwise useless). Learned a bit of turtle for drawing (https://docs.python.org/2/library/turtle.html) (also fun but otherwise useless).

Week 2: still copying other people's Pygame code, and got scolded by a friend for it; after all, my goal in learning Python was to go out and find a job, not to amuse myself. By this point, though, I had already fallen deeply for Python.

Week 3: started reading the thin Flask book; skimmed the table of contents on day one, started reading properly on day two, then got stuck dead at the template-engine chapter.

Week 4: made a second pass at the Flask book and died at chapter four again. Went off to read the first half of Core Python Programming, the advanced part, to calm myself down, because half a month with no progress felt crushing.
Note: this was the lowest point of learning Python. I could read the book over and over, understand every single word, and still retain nothing once the book was closed. Even copying code straight from the book would not run, simply because I had not understood what the default `templates` folder path meant. I was losing my mind and wanted to die. The next two months, though, were extremely productive. Brace yourself for a beating.

**Month 5: Flask** (7.12 - 8.12)

Repo: /dodoru/learn_in_python

Week 1: typed through a Chinese Flask tutorial site from start to finish, reading things a third and fourth time, looking things up, and shamelessly asking people for help when stuck. Along the way learned some HTTP protocol basics, GET and POST, sockets, and other elementary network programming.

Week 2: something suddenly clicked; kept typing through the Flask tutorial until I could build a site that used plain txt files as crude storage.

Week 3: replaced txt with sqlite3, got familiar with SQL statements, and started learning SQLAlchemy.

Week 4: learned cookies, sessions, and so on.

**Month 6: Flask + SQLAlchemy** (8.12 - 9.12)

Week 1: practiced SQLAlchemy (/dodoru/flask_todo) and began migrating the demo site off raw sqlite3 storage.

Week 2: used flask-mail to add a send-the-user-their-password feature; with the site basically working, packaged it as Lilium (/dodoru/Lilium); kept devouring the book and learned blueprints.

Week 3: kept devouring the book, then copied its large blog project (never finished). Wrote my résumé.

Week 4: job hunting.
---over---

**Story time:**

I started job hunting on September 1st, picked eight companies I really liked, and sent out résumés. I got five interview invitations, all crammed into three days. On September 5th I took leave specifically to interview (a seven-hour one-way trip, at my own expense; I had assumed nobody would want me, so the first round was just testing the waters). To my surprise it all went smoothly, and I got three verbal offers on the same day. They were not spectacular (I am not spectacularly good), but any of them would have let me scrape by in a first-tier city.

Then I went home to wait for the e-mail offers, happy but also anxious. When nothing arrived I got a little scared and sent out a few more résumés. That is when I ran into a startup I absolutely loved. After passing the phone interview and the coding test, they invited me over. Right before going I looked up the company's background (I still regret doing that), got thoroughly intimidated, and shook the whole way through (I am still shaking now). My mind went blank; I have no idea what I answered, as every brain cell dispersed into disconnected particles and refused to start. But the technical lead was extremely nice and gave me an offer anyway (enough to live on in that city). On the way home I zoned out for three hours, then cried, laughed, could not fall asleep, and woke up early; I held out for less than four hours the next morning before accepting.

That afternoon another phone offer came in :D, which I turned down on the spot with zero hesitation. Two days later I started proactively replying to the companies that had sent me e-mail offers and verbal offers, so neither side would waste the other's time. I hear it is acceptable not to reply to verbal offers, but leaving a process dangling felt like a zombie process to me; replying and then killing each process cleanly was a great relief.

Since then I have been in a dreamlike state. I am currently learning JS and jQuery, plus Scrapy. :D
**Finally**

Learning Python + Flask + SQLAlchemy this quickly in half a year would have been impossible by working entirely alone behind closed doors. You can guess with your toes that I searched constantly for materials and answers, and lurked quietly in several technical groups and blogs. To avoid being doxxed, I will not name the groups. Here is my Zhihu answer on how to ask the experts in a field for help:

**"How do you ask the leading experts in a field for help, and what should you watch out for?"**

I think the essentials are attitude, motive, honesty, and baseline effort.

Attitude: be eager and thick-skinned. When someone helps you it is kindness; ignoring you is normal; so any willingness to offer guidance deserves gratitude.

Motive: be well-intentioned. Do not be malicious, do not dig traps for the other person to fall into, and do not use petty cleverness or word-twisting to turn their own statements against them. By and large, experienced people have seen a great deal and can tell whether you genuinely want to learn or are just picking a fight; the occasional misjudgment is self-protection.

Honesty is one of my few redeeming qualities: what I actually think, whether I truly understand, what I have studied or touched, what attempts and inferences I have made, cause and effect. If asked, say it honestly. Absolutely never pretend to understand.

Baseline effort: before asking, at least search to see whether similar questions exist. In my experience, most of the problems I hit were not unique to me, so look at what others have said first; if you cannot understand it, or cannot find a reasonable explanation, then ask.

/question//answer/

Wish me happiness, and that I soon grow into a minor expert. Please bless me :D

The same to you.

------ September 2015 ------

For a long time afterwards, reading this post made me ache.

My bar for "minor expert" back then was very low. I was at the bottom of a well, and I thought that jumping out of the well mouth would make me one. Then I jumped out, and found the world was not as lovely as I had imagined, and that in this world's jungle I was still very small.

Right now, I do not have happiness.

But about the past I feel proud, ashamed, and pained. The one thing I do not feel is regret.

------ June 2017 ------
Wen Geng, from LZYY
Qbit report | WeChat account QbitAI

AI has come to rescue your doodled masterpieces again!

In late February this year, Qbit covered an AI app that "draws you a cat in minutes". Three months on, it is the same recipe and the same familiar flavor, except this time it draws human faces.

We played with it for a while and got hooked all over again.

The method is simple: casually draw a face, whether at children's-stick-figure level or master-sketcher level. As long as you outline a rough face and click the red button in the middle, something magical happens: the AI completes your drawing into a finished artistic portrait!

[Animated GIF demonstrating the full process]

[A few more "masterpieces" from other users, for your enjoyment]

Soul painters, tempted? Want to try it? The portal: http://fotogenerator.npocloud.nl/

This face-drawing app comes from the Netherlands, the one in Europe. The network was trained on 200 photos of Ms. Lara Rense, and reportedly will only be online for a two-week test.

Behind the technology

Like the earlier cat drawer, the technology behind this app is image-to-image translation with conditional adversarial networks. The network learns not only a mapping from input image to output image, but also a loss function for training that mapping.

This lets the same generic approach work on problems that would traditionally need different loss functions; that is, reasonable results can be achieved without hand-designing a loss function. The method has proven effective in tasks such as synthesizing photos from label maps, reconstructing objects from edge maps, and image colorization.

The research comes from Phillip Isola, Jun-Yan Zhu, Tinghui Zhou, and Alexei A. Efros at UC Berkeley. They have also released the paper and code.

Paper: https://arxiv.org/pdf/v1.pdf

Code:

- Torch: phillipi/pix2pix
- PyTorch: junyanz/pytorch-CycleGAN-and-pix2pix
- TensorFlow: affinelayer/pix2pix-tensorflow and yenchenlin/pix2pix-tensorflow
- Chainer: pfnet-research/chainer-pix2pix
- Keras: tdeboissiere/DeepLearningImplementations

More demos

Built on the paper and code above, there are many more fun demos to play with. Besides faces and cats, you can draw buildings, shoes, and more.

Demos: Image-to-Image Demo - Affine Layer

And do you remember the cat-drawing achievements from last time?

Calling all soul painters once more: come draw some faces! Once again, the portal: http://fotogenerator.npocloud.nl/

[Done]

**One More Thing…**

What else in the AI world is worth watching today? Reply "今天" (today) in the Qbit (QbitAI) WeChat account chat to see the AI industry and research news we have gathered from around the web.
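For reference, the objective described above (a learned mapping plus a reconstruction term, so no loss needs hand-designing per task) is, in the notation of the Isola et al. pix2pix paper, with input $x$, target $y$, and noise $z$:

$$\mathcal{L}_{cGAN}(G,D) = \mathbb{E}_{x,y}\left[\log D(x,y)\right] + \mathbb{E}_{x,z}\left[\log\left(1 - D(x, G(x,z))\right)\right]$$

$$\mathcal{L}_{L1}(G) = \mathbb{E}_{x,y,z}\left[\lVert y - G(x,z)\rVert_1\right]$$

$$G^{*} = \arg\min_G \max_D \; \mathcal{L}_{cGAN}(G,D) + \lambda\,\mathcal{L}_{L1}(G)$$

The L1 term pushes the generator toward outputs close to the ground truth, while the conditional discriminator supplies the learned, task-adaptive part of the loss.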
Foreword

A while back, at the TensorFlow Dev Summit, Google hyped a wave of doing machine learning with TensorFlow, saying the API is written to mimic scikit-learn's, which looks quite tempting.

Some algorithms may be missing from the official docs but do have code in the official repo, for example GMM and WALS: contrib/factorization/python/ops.

In this first installment we cover the basic classification models, mainly LinearClassifier, SVM, Random Forest, and wide-and-deep, moving from shallow to deep and covering the points to watch for in each algorithm.

LinearClassifier

Dataset description

First, a description of the dataset used in the experiments here (download: census income mld). It is a collection of US income records; the features are about 40 dimensions of personal information including job, age, education level, family situation, and so on, and the label is whether annual income exceeds 50k, i.e. a binary classifier. All the algorithms that follow use this same data for their experiments.

Reading the data

The dataset is in CSV format. With pandas you can quickly read the data in, format it as a DataFrame, and do some basic preprocessing. The code below reads the data from the CSV files. Pandas automatically infers `object` dtype for columns whose categorical values look numeric (values like 02, 40), so before use those columns need converting to `str`:

TRAIN_FILE = '../data/census/census-income.data'
TEST_FILE = '../data/census/census-income.test'
df_train = pd.read_csv(TRAIN_FILE, names=COLUMNS, skipinitialspace=True)
df_test = pd.read_csv(TEST_FILE, names=COLUMNS, skipinitialspace=True)
df_train = df_train.dropna(how='any', axis=0)
df_test = df_test.dropna(how='any', axis=0)
df_train[[
    'detailed_industry_recode', 'detailed_occupation_recode', 'year',
    'own_business_or_self_employed', 'veterans_benefits'
]] = df_train[[
    'detailed_industry_recode', 'detailed_occupation_recode', 'year',
    'own_business_or_self_employed', 'veterans_benefits'
]].astype(str)
df_test[[
    'detailed_industry_recode', 'detailed_occupation_recode', 'year',
    'own_business_or_self_employed', 'veterans_benefits'
]] = df_test[[
    'detailed_industry_recode', 'detailed_occupation_recode', 'year',
    'own_business_or_self_employed', 'veterans_benefits'
]].astype(str)
df_train[LABEL_COLUMN] = (
    df_train[LABEL_COLUMN].apply(lambda x: '+' in x)).astype(int)
df_test[LABEL_COLUMN] = (
    df_test[LABEL_COLUMN].apply(lambda x: '+' in x)).astype(int)
dtypes = df_train.dtypes
After reading the data in from disk, you need to mark the attribute type of each dimension, e.g. continuous variable vs. categorical variable. In sklearn this is conveniently handled with preprocessing.OneHotEncoder(); TF.learn has similar internal machinery, but it is comparatively more involved:

class_of_worker = tf.contrib.layers.sparse_column_with_hash_bucket(
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'class_of_worker'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&detailed_industry_recode&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'detailed_industry_recode'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&detailed_occupation_recode&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'detailed_occupation_recode'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&education&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'education'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&enroll_in_edu_inst_last_wk&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'enroll_in_edu_inst_last_wk'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&marital_stat&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'marital_stat'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&major_industry_code&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'major_industry_code'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&major_occupation_code&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'major_occupation_code'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&race&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'race'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&hispanic_origin&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'hispanic_origin'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&sex&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_keys&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'sex'&/span&&span class=&p&&,&/span& &span class=&n&&keys&/span&&span class=&o&&=&/span&&span class=&p&&[&/span&&span class=&s1&&'Female'&/span&&span class=&p&&,&/span& &span class=&s1&&'Male'&/span&&span class=&p&&])&/span&
&span class=&n&&member_of_labor_union&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'member_of_labor_union'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&reason_for_unemployment&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'reason_for_unemployment'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&full_or_part_time_employment_stat&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'full_or_part_time_employment_stat'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&tax_filer_stat&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'tax_filer_stat'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&region_of_previous_residence&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'region_of_previous_residence'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&state_of_previous_residence&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'state_of_previous_residence'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&detailed_household_and_family_stat&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'detailed_household_and_family_stat'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&detailed_household_summary_in_household&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'detailed_household_summary_in_household'&/span&&span class=&p&&,&/span&
&span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&migration_code_change_in_msa&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'migration_code_change_in_msa'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&migration_code_change_in_reg&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'migration_code_change_in_reg'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&migration_code_move_within_reg&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'migration_code_move_within_reg'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&live_in_this_house_1year_ago&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'live_in_this_house_1year_ago'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&migration_prev_res_in_sunbelt&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'migration_prev_res_in_sunbelt'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&family_members_under18&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'family_members_under18'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&country_of_birth_father&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'country_of_birth_father'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&country_of_birth_mother&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'country_of_birth_mother'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&country_of_birth_self&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'country_of_birth_self'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&citizenship&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'citizenship'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&own_business_or_self_employed&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'own_business_or_self_employed'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&fill_inc_questionnaire_for_veteran_admin&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'fill_inc_questionnaire_for_veteran_admin'&/span&&span class=&p&&,&/span&
&span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&veterans_benefits&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_hash_bucket&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'veterans_benefits'&/span&&span class=&p&&,&/span& &span class=&n&&hash_bucket_size&/span&&span class=&o&&=&/span&&span class=&mi&&1000&/span&&span class=&p&&)&/span&
&span class=&n&&year&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&sparse_column_with_keys&/span&&span class=&p&&(&/span&
&span class=&n&&column_name&/span&&span class=&o&&=&/span&&span class=&s1&&'year'&/span&&span class=&p&&,&/span& &span class=&n&&keys&/span&&span class=&o&&=&/span&&span class=&p&&[&/span&&span class=&s1&&'94'&/span&&span class=&p&&,&/span& &span class=&s1&&'95'&/span&&span class=&p&&])&/span&
&span class=&c1&&# Continuous base columns&/span&
&span class=&n&&age&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&&span class=&s1&&'age'&/span&&span class=&p&&)&/span&
&span class=&n&&age_buckets&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&bucketized_column&/span&&span class=&p&&(&/span&
&span class=&n&&age&/span&&span class=&p&&,&/span& &span class=&n&&boundaries&/span&&span class=&o&&=&/span&&span class=&p&&[&/span&&span class=&mi&&18&/span&&span class=&p&&,&/span& &span class=&mi&&25&/span&&span class=&p&&,&/span& &span class=&mi&&30&/span&&span class=&p&&,&/span& &span class=&mi&&35&/span&&span class=&p&&,&/span& &span class=&mi&&40&/span&&span class=&p&&,&/span& &span class=&mi&&45&/span&&span class=&p&&,&/span& &span class=&mi&&50&/span&&span class=&p&&,&/span& &span class=&mi&&55&/span&&span class=&p&&,&/span& &span class=&mi&&60&/span&&span class=&p&&,&/span& &span class=&mi&&65&/span&&span class=&p&&])&/span&
&span class=&n&&wage_per_hour&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&&span class=&s1&&'wage_per_hour'&/span&&span class=&p&&)&/span&
&span class=&n&&capital_gains&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&&span class=&s1&&'capital_gains'&/span&&span class=&p&&)&/span&
&span class=&n&&capital_losses&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&&span class=&s1&&'capital_losses'&/span&&span class=&p&&)&/span&
&span class=&n&&dividends_from_stocks&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&
&span class=&s1&&'dividends_from_stocks'&/span&&span class=&p&&)&/span&
&span class=&n&&instance_weight&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&&span class=&s1&&'instance_weight'&/span&&span class=&p&&)&/span&
&span class=&n&&weeks_worked_in_year&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&
&span class=&s1&&'weeks_worked_in_year'&/span&&span class=&p&&)&/span&
&span class=&n&&num_persons_worked_for_employer&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&layers&/span&&span class=&o&&.&/span&&span class=&n&&real_valued_column&/span&&span class=&p&&(&/span&
&span class=&s1&&'num_persons_worked_for_employer'&/span&&span class=&p&&)&/span&
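As an aside on what hash bucketing actually does, here is a minimal sketch. This is not TensorFlow's internal fingerprint function; `zlib.crc32` is only a stand-in for a deterministic string hash, used to illustrate the idea of `sparse_column_with_hash_bucket`:

```python
import zlib

def hash_bucket(value, hash_bucket_size=1000):
    # Map a categorical string to an integer index in [0, hash_bucket_size),
    # mimicking what sparse_column_with_hash_bucket does conceptually.
    # crc32 is stable across runs, unlike Python's randomized builtin hash();
    # TF's real fingerprint function differs.
    return zlib.crc32(value.encode('utf-8')) % hash_bucket_size

# The same string always lands in the same bucket; distinct strings may
# collide, which the model tolerates at the cost of some feature mixing.
idx = hash_bucket('United-States')
```

With `hash_bucket_size=1000` and roughly a few dozen distinct values per census column, collisions are rare, which is why the author can use the same bucket size for every categorical column.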
&/code&&/pre&&/div&&p&real_valued_column handles continuous features. For categorical variables there are two options: sparse_column_with_keys, for columns whose values can be enumerated explicitly, and sparse_column_with_hash_bucket, which maps each categorical value to a numeric index by hashing.&/p&&div class=&highlight&&&pre&&code class=&language-python&&&span&&/span&&span class=&k&&def&/span& &span class=&nf&&input_fn&/span&&span class=&p&&(&/span&&span class=&n&&df&/span&&span class=&p&&):&/span&
    &span class=&c1&&# Creates a dictionary mapping from each continuous feature column name (k) to&/span&
    &span class=&c1&&# the values of that column stored in a constant Tensor.&/span&
    &span class=&n&&continuous_cols&/span& &span class=&o&&=&/span& &span class=&p&&{&/span&
        &span class=&n&&k&/span&&span class=&p&&:&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&constant&/span&&span class=&p&&(&/span&&span class=&n&&df&/span&&span class=&p&&[&/span&&span class=&n&&k&/span&&span class=&p&&]&/span&&span class=&o&&.&/span&&span class=&n&&values&/span&&span class=&p&&)&/span&
        &span class=&k&&for&/span& &span class=&n&&k&/span& &span class=&ow&&in&/span& &span class=&n&&CONTINUOUS_COLUMNS&/span&
    &span class=&p&&}&/span&
    &span class=&c1&&# Creates a dictionary mapping from each categorical feature column name (k)&/span&
    &span class=&c1&&# to the values of that column stored in a tf.SparseTensor.&/span&
    &span class=&n&&categorical_cols&/span& &span class=&o&&=&/span& &span class=&p&&{&/span&
        &span class=&n&&k&/span&&span class=&p&&:&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&SparseTensor&/span&&span class=&p&&(&/span&
            &span class=&n&&indices&/span&&span class=&o&&=&/span&&span class=&p&&[[&/span&&span class=&n&&i&/span&&span class=&p&&,&/span& &span class=&mi&&0&/span&&span class=&p&&]&/span& &span class=&k&&for&/span& &span class=&n&&i&/span& &span class=&ow&&in&/span& &span class=&nb&&range&/span&&span class=&p&&(&/span&&span class=&n&&df&/span&&span class=&p&&[&/span&&span class=&n&&k&/span&&span class=&p&&]&/span&&span class=&o&&.&/span&&span class=&n&&size&/span&&span class=&p&&)],&/span&
            &span class=&n&&values&/span&&span class=&o&&=&/span&&span class=&n&&df&/span&&span class=&p&&[&/span&&span class=&n&&k&/span&&span class=&p&&]&/span&&span class=&o&&.&/span&&span class=&n&&values&/span&&span class=&p&&,&/span&
            &span class=&n&&dense_shape&/span&&span class=&o&&=&/span&&span class=&p&&[&/span&&span class=&n&&df&/span&&span class=&p&&[&/span&&span class=&n&&k&/span&&span class=&p&&]&/span&&span class=&o&&.&/span&&span class=&n&&size&/span&&span class=&p&&,&/span& &span class=&mi&&1&/span&&span class=&p&&])&/span&
        &span class=&k&&for&/span& &span class=&n&&k&/span& &span class=&ow&&in&/span& &span class=&n&&CATEGORICAL_COLUMNS&/span&
    &span class=&p&&}&/span&
    &span class=&c1&&# Merges the two dictionaries into one.&/span&
    &span class=&n&&feature_cols&/span& &span class=&o&&=&/span& &span class=&nb&&dict&/span&&span class=&p&&(&/span&&span class=&n&&continuous_cols&/span&&span class=&o&&.&/span&&span class=&n&&items&/span&&span class=&p&&()&/span& &span class=&o&&+&/span& &span class=&n&&categorical_cols&/span&&span class=&o&&.&/span&&span class=&n&&items&/span&&span class=&p&&())&/span&
    &span class=&c1&&# Converts the label column into a constant Tensor.&/span&
    &span class=&n&&label&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&constant&/span&&span class=&p&&(&/span&&span class=&n&&df&/span&&span class=&p&&[&/span&&span class=&n&&LABEL_COLUMN&/span&&span class=&p&&]&/span&&span class=&o&&.&/span&&span class=&n&&values&/span&&span class=&p&&)&/span&
    &span class=&c1&&# Returns the feature columns and the label.&/span&
    &span class=&k&&return&/span& &span class=&n&&feature_cols&/span&&span class=&p&&,&/span& &span class=&n&&label&/span&
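To make the SparseTensor construction concrete, here is what the indices/values/dense_shape triple looks like for a toy categorical column of three rows (hand-rolled Python, no TensorFlow required; the column values are made up for illustration):

```python
# A single categorical column with 3 rows, as input_fn would see it.
values = ['Private', 'Federal government', 'Private']

# One entry per row, always in column 0, so the resulting tensor is an
# (n_rows, 1) sparse matrix in which every cell happens to be filled.
indices = [[i, 0] for i in range(len(values))]
dense_shape = [len(values), 1]

print(indices)      # [[0, 0], [1, 0], [2, 0]]
print(dense_shape)  # [3, 1]
```

The (n, 1) shape matters: the feature-column machinery expects a 2-D sparse input even when each example carries exactly one categorical value.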
&/code&&/pre&&/div&&p&After defining the feature columns, and since our data is not already split into data and target, we need an input_fn to prepare the input. Following the repository source, it converts the continuous features into a dict mapping column names to constant Tensors, and the categorical features into SparseTensors in the special format shown above.&/p&&h3&Model Training&/h3&&p&Once the data is prepared, training the model is straightforward: as shown below, just configure the FEATURE_COLUMNS and the directory where the model should be saved.&/p&&div class=&highlight&&&pre&&code class=&language-python&&&span&&/span&&span class=&k&&def&/span& &span class=&nf&&train_input_fn&/span&&span class=&p&&():&/span&
    &span class=&k&&return&/span& &span class=&n&&input_fn&/span&&span class=&p&&(&/span&&span class=&n&&df_train&/span&&span class=&p&&)&/span&
&span class=&k&&def&/span& &span class=&nf&&eval_input_fn&/span&&span class=&p&&():&/span&
    &span class=&k&&return&/span& &span class=&n&&input_fn&/span&&span class=&p&&(&/span&&span class=&n&&df_test&/span&&span class=&p&&)&/span&
&span class=&n&&model_dir&/span& &span class=&o&&=&/span& &span class=&s1&&'../model_dir'&/span&
&span class=&n&&model&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&learn&/span&&span class=&o&&.&/span&&span class=&n&&LinearClassifier&/span&&span class=&p&&(&/span&
&span class=&n&&feature_columns&/span&&span class=&o&&=&/span&&span class=&n&&FEATURE_COLUMNS&/span&&span class=&p&&,&/span& &span class=&n&&model_dir&/span&&span class=&o&&=&/span&&span class=&n&&model_dir&/span&&span class=&p&&)&/span&
&span class=&n&&model&/span&&span class=&o&&.&/span&&span class=&n&&fit&/span&&span class=&p&&(&/span&&span class=&n&&input_fn&/span&&span class=&o&&=&/span&&span class=&n&&train_input_fn&/span&&span class=&p&&,&/span& &span class=&n&&steps&/span&&span class=&o&&=&/span&&span class=&mi&&200&/span&&span class=&p&&)&/span&
&span class=&n&&results&/span& &span class=&o&&=&/span& &span class=&n&&model&/span&&span class=&o&&.&/span&&span class=&n&&evaluate&/span&&span class=&p&&(&/span&&span class=&n&&input_fn&/span&&span class=&o&&=&/span&&span class=&n&&eval_input_fn&/span&&span class=&p&&,&/span& &span class=&n&&steps&/span&&span class=&o&&=&/span&&span class=&mi&&1&/span&&span class=&p&&)&/span&
&span class=&k&&for&/span& &span class=&n&&key&/span& &span class=&ow&&in&/span& &span class=&nb&&sorted&/span&&span class=&p&&(&/span&&span class=&n&&results&/span&&span class=&p&&):&/span&
    &span class=&k&&print&/span&&span class=&p&&(&/span&&span class=&s2&&&&/span&&span class=&si&&%s&/span&&span class=&s2&&: &/span&&span class=&si&&%s&/span&&span class=&s2&&&&/span& &span class=&o&&%&/span& &span class=&p&&(&/span&&span class=&n&&key&/span&&span class=&p&&,&/span& &span class=&n&&results&/span&&span class=&p&&[&/span&&span class=&n&&key&/span&&span class=&p&&]))&/span&
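evaluate() returns a plain dict of metric names to values, which the sorted loop above pretty-prints. A quick illustration with a stand-in results dict (the metric values here are invented, not actual run output):

```python
# Stand-in for what model.evaluate() returns: a dict of metric -> value.
results = {'loss': 0.35, 'accuracy': 0.94, 'global_step': 200}

# sorted() over a dict iterates its keys alphabetically, so the printout
# is deterministic regardless of the dict's insertion order.
for key in sorted(results):
    print('%s: %s' % (key, results[key]))
# accuracy: 0.94
# global_step: 200
# loss: 0.35
```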
&/code&&/pre&&/div&&p&The final results are shown below:&br&&img src=&/v2-9fa339d91d043a3f89e2e7d55cdc0e89_b.png& data-rawwidth=&3358& data-rawheight=&552& class=&origin_image zh-lightbox-thumb& width=&3358& data-original=&/v2-9fa339d91d043a3f89e2e7d55cdc0e89_r.png&&Here I have only used TF.Learn's LinearClassifier for a small demo; other algorithms follow later, along with more small tricks for getting the most out of machine learning in TF.Learn.&br&The full code is at &a href=&/?target=https%3A///burness/tensorflow-101/blob/master/machinelearning_toolkit/scripts/linear_classifier.py& class=& wrap external& target=&_blank& rel=&nofollow noreferrer&&tensorflow-101/machinelearning_toolkit/scripts/linear_classifier.py&i class=&icon-external&&&/i&&/a&&/p&&h3&Support Vector Machine&/h3&&p&The support vector machine is used in much the same way, and most of the code in linear_classifier.py can be reused. There are three notable differences:&br&1. SVM requires an example_id column to be specified, so we need to add one in input_fn;&br&2. The SVM call has a reshape bug under the hood, which I discovered while experimenting with it; the details are described here: &a href=&/?target=http%3A///questions//check-failed-ndims-dims-2-vs-1-when-i-build-a-svm-model& class=& wrap external& target=&_blank& rel=&nofollow noreferrer&&check-failed-ndims-dims-2-vs-1-when-i-build-a-svm-model&i class=&icon-external&&&/i&&/a&. Roughly, for a continuous feature with, say, 200 values, the values have shape (200,) rather than (200, 1). I filed an issue: &a href=&/?target=https%3A///tensorflow/tensorflow/issues/9505& class=& wrap external& target=&_blank& rel=&nofollow noreferrer&&Check failed: NDIMS == dims() (2 vs. 1) when I build a svm model&i class=&icon-external&&&/i&&/a&; RandomForest below hits a similar problem. Until it is fixed upstream, the workaround is to reshape the constants explicitly, changing the original continuous_cols to: continuous_cols = {k: tf.constant(df[k].values, shape=[df[k].size, 1]) for k in CONTINUOUS_COLUMNS};&br&3. Swap the model for an SVM:&br&&/p&&div class=&highlight&&&pre&&code class=&language-text&&&span&&/span&model_dir = '../svm_model_dir'
model = svm.SVM(example_id_column='example_id',
                feature_columns=FEATURE_COLUMNS,
                model_dir=model_dir)
model.fit(input_fn=train_input_fn, steps=10)
results = model.evaluate(input_fn=eval_input_fn, steps=1)
for key in sorted(results):
    print(&%s: %s& % (key, results[key]))
&/code&&/pre&&/div&&p&The SVM code is at &a href=&/?target=https%3A///burness/tensorflow-101/blob/master/machinelearning_toolkit/scripts/tf-svm.py& class=& wrap external& target=&_blank& rel=&nofollow noreferrer&&tensorflow-101/machinelearning_toolkit/scripts/tf-svm.py&i class=&icon-external&&&/i&&/a&&br&A minimal example reproducing the real-valued-column bug: &a href=&/?target=https%3A///burness/tensorflow-101/blob/master/machinelearning_toolkit/scripts/simple-tf-svm.py& class=& external& target=&_blank& rel=&nofollow noreferrer&&tensorflow-101/machinelearning_toolkit/scripts/simple-tf-svm.py&i class=&icon-external&&&/i&&/a&&/p&&p&The final result after 100 steps:&br&&/p&&h3&RandomForest&/h3&&p&The random forest model's interface also differs a little from LinearClassifier's; the model definition and training become:&/p&&div class=&highlight&&&pre&&code class=&language-text&&&span&&/span&validation_metrics = {
    &accuracy&:
        tf.contrib.learn.MetricSpec(
            metric_fn=tf.contrib.metrics.streaming_accuracy,
            prediction_key='probabilities'),
    &precision&:
        tf.contrib.learn.MetricSpec(
            metric_fn=tf.contrib.metrics.streaming_precision,
            prediction_key='probabilities'),
    &recall&:
        tf.contrib.learn.MetricSpec(
            metric_fn=tf.contrib.metrics.streaming_recall,
            prediction_key='probabilities')
}
hparams = tf.contrib.tensor_forest.python.tensor_forest.ForestHParams(
    num_trees=10,
    max_nodes=1000,
    num_classes=2,
    num_features=len(CONTINUOUS_COLUMNS) + len(CATEGORICAL_COLUMNS))
classifier = random_forest.TensorForestEstimator(hparams, model_dir=model_dir, config=tf.contrib.learn.RunConfig(save_checkpoints_secs=60))
classifier.fit(input_fn=train_input_fn, steps=200)
results = classifier.evaluate(
    input_fn=eval_input_fn, steps=1, metrics=validation_metrics)
print(results)
for key in sorted(results):
    print(&%s: %s& % (key, results[key]))
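For intuition about the three streaming metrics configured above, here is what accuracy, precision and recall compute, hand-rolled on toy binary labels. This is a sketch only; the tf.contrib.metrics streaming versions accumulate the same counts incrementally across batches:

```python
def binary_metrics(labels, preds):
    # Count true positives, false positives and false negatives.
    tp = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 1)
    fp = sum(1 for l, p in zip(labels, preds) if l == 0 and p == 1)
    fn = sum(1 for l, p in zip(labels, preds) if l == 1 and p == 0)
    accuracy = sum(1 for l, p in zip(labels, preds) if l == p) / float(len(labels))
    precision = tp / float(tp + fp)  # of predicted positives, how many are right
    recall = tp / float(tp + fn)     # of true positives, how many were found
    return accuracy, precision, recall

acc, prec, rec = binary_metrics([1, 0, 1, 1], [1, 0, 0, 1])
print(acc, prec, rec)  # 0.75 1.0 0.666...
```

Note the toy function divides by zero if there are no predicted (or no true) positives; the streaming implementations guard against that.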
&/code&&/pre&&/div&&p&Also, LinearClassifier and SVM above produced no output at all during training, which is not very friendly. Checking the TensorFlow documentation, you can have progress printed during training by adding a single line, tf.logging.set_verbosity(tf.logging.INFO), which logs the loss as training proceeds:&/p&&p&&img src=&/v2-92f0c925b4a1caef10f2_b.png& data-rawwidth=&3356& data-rawheight=&446& class=&origin_image zh-lightbox-thumb& width=&3356& data-original=&/v2-92f0c925b4a1caef10f2_r.png&&This is rough, of course. Also, for some reason RF's evaluate does not report accuracy by default; to get those numbers I defined validation_metrics and passed it to evaluate, which is described in more detail in the wide-and-deep experiment below. The final results:&br&&img src=&/v2-672a92b07b312f9f0b5f0d826eee0dd2_b.png& data-rawwidth=&3348& data-rawheight=&376& class=&origin_image zh-lightbox-thumb& width=&3348& data-original=&/v2-672a92b07b312f9f0b5f0d826eee0dd2_r.png&&&/p&&p&The RF code is at &a href=&/?target=https%3A///burness/tensorflow-101/blob/master/machinelearning_toolkit/scripts/tf-rf.py& class=& wrap external& target=&_blank& rel=&nofollow noreferrer&&tensorflow-101/machinelearning_toolkit/scripts/tf-rf.py&i class=&icon-external&&&/i&&/a&&/p&&h3&wide and deep&/h3&&p&Wide and deep models are easy to define and use in TF.Learn; the more involved part is the feature processing. For the wide part, real-valued columns are usually bucketized, e.g. age_buckets = tf.contrib.layers.bucketized_column(age, boundaries=[18, 25, 30, 35, 40, 45, 50, 55, 60, 65]). The boundaries must be supplied to discretize the continuous values; I don't know whether there is an API that avoids specifying boundaries or computes them proportionally, something to investigate later. Once discretized, a column can be used directly as a wide column, but usually you build further crossed columns:&br&tf.contrib.layers.crossed_column(columns=[age_buckets, class_of_worker], hash_bucket_size=1000)&br&Speaking of which, this reminds me of something from a while back: in a chat with someone from a big company, I spent ten minutes explaining that simple feature crosses amount to higher-dimensional features, followed by all sorts of bizarre questions. Big companies, please take hiring seriously; interviewers should at least be competent.&br&To keep the code simple, I only crossed two columns here; in my experience, crossed columns usually bring a clear lift in the feature dimension, especially for linear models like LinearClassifier.&/p&&p&The deep columns usually need little processing for continuous features; the main work is vectorizing the categorical variables after discretization, typically with one_hot_column and embedding_column. one_hot_column suits variables like sex and year whose values are few and easy to enumerate, while embedding_column re-embeds a categorical variable as a dense vector. The official source explains this part: &a href=&/?target=https%3A///tensorflow/tensorflow/blob/r1.1/tensorflow/contrib/layers/python/layers/feature_column.py& class=& wrap external& target=&_blank& rel=&nofollow noreferrer&&tensorflow/contrib/layers/python/layers/feature_column.py&i class=&icon-external&&&/i&&/a&. I'm not yet clear on the algorithm inside; I'll dig into it carefully later.&/p&&p&That's basically it for feature processing; next comes the model:&/p&&div class=&highlight&&&pre&&code class=&language-python&&&span&&/span&&span class=&n&&validation_metrics&/span& &span class=&o&&=&/span& &span class=&p&&{&/span&
    &span class=&s2&&&accuracy&&/span&&span class=&p&&:&/span&
        &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&learn&/span&&span class=&o&&.&/span&&span class=&n&&MetricSpec&/span&&span class=&p&&(&/span&
            &span class=&n&&metric_fn&/span&&span class=&o&&=&/span&&span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&metrics&/span&&span class=&o&&.&/span&&span class=&n&&streaming_accuracy&/span&&span class=&p&&,&/span&
            &span class=&n&&prediction_key&/span&&span class=&o&&=&/span&&span class=&s2&&&classes&&/span&&span class=&p&&),&/span&
    &span class=&s2&&&precision&&/span&&span class=&p&&:&/span&
        &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&learn&/span&&span class=&o&&.&/span&&span class=&n&&MetricSpec&/span&&span class=&p&&(&/span&
            &span class=&n&&metric_fn&/span&&span class=&o&&=&/span&&span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&metrics&/span&&span class=&o&&.&/span&&span class=&n&&streaming_precision&/span&&span class=&p&&,&/span&
            &span class=&n&&prediction_key&/span&&span class=&o&&=&/span&&span class=&s2&&&classes&&/span&&span class=&p&&),&/span&
    &span class=&s2&&&recall&&/span&&span class=&p&&:&/span&
        &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&learn&/span&&span class=&o&&.&/span&&span class=&n&&MetricSpec&/span&&span class=&p&&(&/span&
            &span class=&n&&metric_fn&/span&&span class=&o&&=&/span&&span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&metrics&/span&&span class=&o&&.&/span&&span class=&n&&streaming_recall&/span&&span class=&p&&,&/span&
            &span class=&n&&prediction_key&/span&&span class=&o&&=&/span&&span class=&s2&&&classes&&/span&&span class=&p&&)&/span&
&span class=&p&&}&/span&
&span class=&n&&validation_monitor&/span& &span class=&o&&=&/span& &span class=&n&&tf&/span&&span class=&o&&.&/span&&span class=&n&&contrib&/span&&span class=&o&&.&/span&&span class=&n&&learn&/span&&span class=&o&&.&/span&&span class=&n&&monitors&/span&&span class=&o&&.&/span&&span class=&n&&ValidationMonitor&/span&&span class=&p&&(&/span&&span class=&n&&input_fn&/span&&span class=&o&&=&/span&&span class=&n&&eval_input_fn&/span&&span class=&p&&,&/span&
    &span class=&n&&every_n_steps&/span&&span class=&o&&=&/span&&span class=&mi&&10&/span&&span class=&p&&,&/span& &span class=&n&&metrics&/span&&span class=&o&&=&/span&&span class=&n&&validation_metrics&/span&&span class=&p&&)&/span&
&/code&&/pre&&/div&
