NLTK: Dropping Python 2.7 support

Created on 10 May 2019  ·  4 comments  ·  Source: nltk/nltk

As seen in https://travis-ci.org/nltk/nltk/jobs/530566954, some of the dependencies that NLTK relies on no longer support Python 2.7. I think this is the right time to drop Python 2.7 support as well, so that our CI keeps running and the library can move forward.

admin python2.7 python3 pythonic

Most helpful comment

With Python 3.8 supported since #2432, maybe it's time to officially drop Python 2.7? It's 2020 =)

P/S: It has already been out of our CI workflow for quite a while.

All 4 comments

I see, OK, time to move on!

Starting with these __future__ imports =)

~/git-stuff/my-contrib/nltk$ grep "__future__" nltk/**/*
nltk/app/chartparser_app.py:from __future__ import division
nltk/app/chunkparser_app.py:from __future__ import division
nltk/app/collocations_app.py:from __future__ import division
nltk/app/rdparser_app.py:from __future__ import division
nltk/app/wordnet_app.py:from __future__ import print_function
nltk/ccg/api.py:from __future__ import unicode_literals
nltk/ccg/chart.py:from __future__ import print_function, division, unicode_literals
nltk/ccg/combinator.py:from __future__ import unicode_literals
nltk/ccg/lexicon.py:from __future__ import unicode_literals
nltk/chat/__init__.py:from __future__ import print_function
nltk/chat/eliza.py:from __future__ import print_function
nltk/chat/iesha.py:from __future__ import print_function
nltk/chat/rude.py:from __future__ import print_function
nltk/chat/suntsu.py:from __future__ import print_function
nltk/chat/util.py:from __future__ import print_function
nltk/chat/zen.py:from __future__ import print_function
nltk/chunk/named_entity.py:from __future__ import print_function
nltk/chunk/named_entity.py:from __future__ import unicode_literals
nltk/chunk/regexp.py:from __future__ import print_function, unicode_literals
nltk/chunk/regexp.py:from __future__ import division
nltk/chunk/util.py:from __future__ import print_function, unicode_literals, division
nltk/classify/decisiontree.py:from __future__ import print_function, unicode_literals, division
nltk/classify/maxent.py:from __future__ import print_function, unicode_literals
nltk/classify/megam.py:from __future__ import print_function
nltk/classify/naivebayes.py:from __future__ import print_function, unicode_literals
nltk/classify/rte_classify.py:from __future__ import print_function
nltk/classify/scikitlearn.py:from __future__ import print_function, unicode_literals
nltk/classify/senna.py:    >>> from __future__ import unicode_literals
nltk/classify/senna.py:from __future__ import unicode_literals
nltk/classify/tadm.py:from __future__ import print_function, unicode_literals
nltk/classify/textcat.py:from __future__ import print_function, unicode_literals
nltk/classify/util.py:from __future__ import print_function, division
nltk/classify/weka.py:from __future__ import print_function
nltk/cluster/em.py:from __future__ import print_function, unicode_literals
nltk/cluster/gaac.py:from __future__ import print_function, unicode_literals, division
nltk/cluster/kmeans.py:from __future__ import print_function, unicode_literals, division
nltk/cluster/util.py:from __future__ import print_function, unicode_literals, division
grep: nltk/corpus/reader: Is a directory
nltk/corpus/util.py:from __future__ import unicode_literals
nltk/draw/table.py:from __future__ import division
nltk/inference/api.py:from __future__ import print_function
nltk/inference/discourse.py:from __future__ import print_function
nltk/inference/mace.py:from __future__ import print_function
nltk/inference/nonmonotonic.py:from __future__ import print_function, unicode_literals
nltk/inference/prover9.py:from __future__ import print_function
nltk/inference/resolution.py:from __future__ import print_function, unicode_literals
nltk/inference/tableau.py:from __future__ import print_function, unicode_literals
nltk/lm/api.py:from __future__ import division, unicode_literals
nltk/lm/counter.py:from __future__ import unicode_literals
nltk/lm/models.py:from __future__ import division, unicode_literals
nltk/lm/vocabulary.py:from __future__ import unicode_literals
nltk/metrics/agreement.py:from __future__ import print_function, unicode_literals, division
nltk/metrics/aline.py:from __future__ import unicode_literals
nltk/metrics/association.py:from __future__ import division
nltk/metrics/confusionmatrix.py:from __future__ import print_function, unicode_literals
nltk/metrics/distance.py:from __future__ import print_function
nltk/metrics/distance.py:from __future__ import division
nltk/metrics/scores.py:from __future__ import print_function, division
nltk/metrics/spearman.py:from __future__ import division
nltk/misc/babelfish.py:from __future__ import print_function
nltk/misc/chomsky.py:from __future__ import print_function
nltk/misc/sort.py:from __future__ import print_function, division
nltk/misc/wordfinder.py:from __future__ import print_function
nltk/parse/bllip.py:from __future__ import print_function
nltk/parse/chart.py:from __future__ import print_function, division, unicode_literals
nltk/parse/corenlp.py:from __future__ import unicode_literals
nltk/parse/dependencygraph.py:from __future__ import print_function, unicode_literals
nltk/parse/earleychart.py:from __future__ import print_function, division
nltk/parse/evaluate.py:from __future__ import division
nltk/parse/featurechart.py:from __future__ import print_function, unicode_literals
nltk/parse/generate.py:from __future__ import print_function
nltk/parse/malt.py:from __future__ import print_function, unicode_literals
nltk/parse/nonprojectivedependencyparser.py:from __future__ import print_function
nltk/parse/pchart.py:from __future__ import print_function, unicode_literals
nltk/parse/projectivedependencyparser.py:from __future__ import print_function, unicode_literals
nltk/parse/recursivedescent.py:from __future__ import print_function, unicode_literals
nltk/parse/shiftreduce.py:from __future__ import print_function, unicode_literals
nltk/parse/stanford.py:from __future__ import unicode_literals
nltk/parse/transitionparser.py:from __future__ import absolute_import
nltk/parse/transitionparser.py:from __future__ import division
nltk/parse/transitionparser.py:from __future__ import print_function
nltk/parse/util.py:from __future__ import print_function
nltk/parse/viterbi.py:from __future__ import print_function, unicode_literals
nltk/sem/boxer.py:from __future__ import print_function, unicode_literals
nltk/sem/chat80.py:from __future__ import print_function, unicode_literals
nltk/sem/cooper_storage.py:from __future__ import print_function
nltk/sem/drt.py:from __future__ import print_function, unicode_literals
nltk/sem/evaluate.py:from __future__ import print_function, unicode_literals
nltk/sem/glue.py:from __future__ import print_function, division, unicode_literals
nltk/sem/hole.py:from __future__ import print_function, unicode_literals
nltk/sem/lfg.py:from __future__ import print_function, division, unicode_literals
nltk/sem/linearlogic.py:from __future__ import print_function, unicode_literals
nltk/sem/logic.py:from __future__ import print_function, unicode_literals
nltk/sem/relextract.py:from __future__ import print_function
nltk/sem/util.py:from __future__ import print_function, unicode_literals
nltk/sentiment/sentiment_analyzer.py:from __future__ import print_function
nltk/sentiment/util.py:from __future__ import division
nltk/stem/arlstem.py:from __future__ import unicode_literals
nltk/stem/cistem.py:from __future__ import unicode_literals
nltk/stem/isri.py:from __future__ import unicode_literals
nltk/stem/lancaster.py:from __future__ import unicode_literals
nltk/stem/porter.py:from __future__ import print_function, unicode_literals
nltk/stem/regexp.py:from __future__ import unicode_literals
nltk/stem/rslp.py:from __future__ import print_function, unicode_literals
nltk/stem/snowball.py:from __future__ import unicode_literals, print_function
nltk/stem/wordnet.py:from __future__ import unicode_literals
nltk/tag/__init__.py:from __future__ import print_function
nltk/tag/brill.py:from __future__ import print_function, division
nltk/tag/brill_trainer.py:from __future__ import print_function, division
nltk/tag/crf.py:from __future__ import absolute_import
nltk/tag/crf.py:from __future__ import unicode_literals
nltk/tag/hmm.py:from __future__ import print_function, unicode_literals, division
nltk/tag/mapping.py:from __future__ import print_function, unicode_literals, division
nltk/tag/perceptron.py:from __future__ import absolute_import
nltk/tag/perceptron.py:from __future__ import print_function, division
nltk/tag/sequential.py:from __future__ import print_function, unicode_literals
nltk/tag/tnt.py:from __future__ import print_function, division
nltk/tbl/demo.py:from __future__ import print_function, absolute_import, division
nltk/tbl/erroranalysis.py:from __future__ import print_function
nltk/tbl/feature.py:from __future__ import division, print_function, unicode_literals
nltk/tbl/rule.py:from __future__ import print_function
nltk/tbl/template.py:from __future__ import print_function
nltk/test/childes_fixt.py:from __future__ import absolute_import
nltk/test/classify_fixt.py:from __future__ import absolute_import
nltk/test/compat_fixt.py:from __future__ import absolute_import
nltk/test/corpus_fixt.py:from __future__ import absolute_import
nltk/test/data.doctest:    >>> from __future__ import print_function
nltk/test/discourse_fixt.py:from __future__ import absolute_import
nltk/test/doctest_nose_plugin.py:from __future__ import print_function
nltk/test/featgram.doctest:    >>> from __future__ import print_function
nltk/test/featstruct.doctest:    >>> from __future__ import print_function
nltk/test/gensim_fixt.py:from __future__ import absolute_import
nltk/test/gluesemantics_malt_fixt.py:from __future__ import absolute_import
grep: nltk/test/images: Is a directory
nltk/test/inference_fixt.py:from __future__ import absolute_import
nltk/test/metrics.doctest:   >>> from __future__ import print_function
nltk/test/nonmonotonic_fixt.py:from __future__ import absolute_import
nltk/test/portuguese_en_fixt.py:from __future__ import absolute_import
nltk/test/probability_fixt.py:from __future__ import absolute_import
nltk/test/runtests.py:from __future__ import absolute_import, print_function
nltk/test/segmentation_fixt.py:from __future__ import absolute_import
nltk/test/semantics_fixt.py:from __future__ import absolute_import
nltk/test/simple.doctest:    >>> from __future__ import print_function
nltk/test/stem.doctest:    >>> from __future__ import print_function
nltk/test/tokenize.doctest:    >>> from __future__ import print_function
nltk/test/translate_fixt.py:from __future__ import absolute_import
grep: nltk/test/unit: Is a directory
nltk/test/util.doctest:    >>> from __future__ import print_function
nltk/test/wordnet.doctest:    >>> from __future__ import print_function, unicode_literals
nltk/test/wordnet_fixt.py:from __future__ import absolute_import
nltk/tokenize/casual.py:from __future__ import unicode_literals
nltk/tokenize/nist.py:from __future__ import unicode_literals
nltk/tokenize/punkt.py:from __future__ import print_function, unicode_literals, division
nltk/tokenize/regexp.py:from __future__ import unicode_literals
nltk/tokenize/repp.py:from __future__ import unicode_literals, print_function
nltk/tokenize/simple.py:from __future__ import unicode_literals
nltk/tokenize/sonority_sequencing.py:from __future__ import unicode_literals
nltk/tokenize/stanford.py:from __future__ import unicode_literals, print_function
nltk/tokenize/stanford_segmenter.py:from __future__ import unicode_literals, print_function
nltk/translate/api.py:from __future__ import print_function, unicode_literals
nltk/translate/bleu_score.py:from __future__ import division
nltk/translate/chrf_score.py:from __future__ import division
nltk/translate/gale_church.py:from __future__ import division
nltk/translate/gleu_score.py:from __future__ import division
nltk/translate/ibm1.py:from __future__ import division
nltk/translate/ibm2.py:from __future__ import division
nltk/translate/ibm3.py:from __future__ import division
nltk/translate/ibm4.py:from __future__ import division
nltk/translate/ibm5.py:from __future__ import division
nltk/translate/ibm_model.py:from __future__ import division
nltk/translate/metrics.py:from __future__ import division
nltk/translate/nist_score.py:from __future__ import division
nltk/translate/ribes_score.py:from __future__ import division
nltk/twitter/common.py:from __future__ import print_function
nltk/twitter/twitter_demo.py:from __future__ import print_function
nltk/twitter/util.py:from __future__ import print_function
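
Since all of these imports are no-ops on Python 3 (print_function, division, unicode_literals and absolute_import are the default behaviour there), stripping them is mostly mechanical. Here is a minimal sketch of how that could be scripted; it is not part of NLTK and assumes UTF-8 sources with one `from __future__` import per line, as in the grep output above:

    # Sketch: drop `from __future__ import ...` lines from every .py file
    # under nltk/. Assumes UTF-8 sources and one such import per line.
    import pathlib

    for path in pathlib.Path("nltk").rglob("*.py"):
        lines = path.read_text(encoding="utf-8").splitlines(keepends=True)
        kept = [ln for ln in lines
                if not ln.lstrip().startswith("from __future__ import")]
        if kept != lines:
            path.write_text("".join(kept), encoding="utf-8")
            print("cleaned", path)

The doctest hits above (the `>>> from __future__ import ...` lines) are not plain imports and would be left alone by a filter like this; they need to be edited by hand together with their expected output.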

Then the compatibility code:

~/git-stuff/my-contrib/nltk$ grep "compat" nltk/**/*
nltk/app/chartparser_app.py:        if not self._checkcompat():
nltk/app/chartparser_app.py:        if not self._checkcompat():
nltk/app/chartparser_app.py:        if not self._checkcompat():
nltk/app/chartparser_app.py:    def _checkcompat(self):
nltk/app/concordance_app.py:import nltk.compat
nltk/app/wordnet_app.py:from nltk import compat
nltk/app/wordnet_app.py:if compat.PY3:
nltk/ccg/api.py:from nltk.compat import python_2_unicode_compatible, unicode_repr
nltk/ccg/api.py:@python_2_unicode_compatible
nltk/ccg/api.py:@python_2_unicode_compatible
nltk/ccg/api.py:@python_2_unicode_compatible
nltk/ccg/api.py:@python_2_unicode_compatible
nltk/ccg/chart.py:from nltk.compat import python_2_unicode_compatible
nltk/ccg/chart.py:@python_2_unicode_compatible
nltk/ccg/chart.py:@python_2_unicode_compatible
nltk/ccg/chart.py:@python_2_unicode_compatible
nltk/ccg/combinator.py:from nltk.compat import python_2_unicode_compatible
nltk/ccg/combinator.py:@python_2_unicode_compatible
nltk/ccg/combinator.py:@python_2_unicode_compatible
nltk/ccg/combinator.py:@python_2_unicode_compatible
nltk/ccg/combinator.py:@python_2_unicode_compatible
nltk/ccg/combinator.py:@python_2_unicode_compatible
nltk/ccg/combinator.py:@python_2_unicode_compatible
nltk/ccg/lexicon.py:from nltk.compat import python_2_unicode_compatible
nltk/ccg/lexicon.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:from nltk.compat import python_2_unicode_compatible, unicode_repr
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/regexp.py:@python_2_unicode_compatible
nltk/chunk/util.py:from nltk.compat import python_2_unicode_compatible
nltk/classify/decisiontree.py:from nltk.compat import python_2_unicode_compatible
nltk/classify/decisiontree.py:@python_2_unicode_compatible
nltk/classify/maxent.py:from nltk import compat
nltk/classify/maxent.py:@compat.python_2_unicode_compatible
nltk/classify/megam.py:from nltk import compat
nltk/classify/scikitlearn.py:from nltk import compat
nltk/classify/scikitlearn.py:@compat.python_2_unicode_compatible
nltk/classify/senna.py:from nltk.compat import python_2_unicode_compatible
nltk/classify/senna.py:@python_2_unicode_compatible
nltk/classify/textcat.py:from nltk.compat import PY3
nltk/cluster/em.py:from nltk.compat import python_2_unicode_compatible
nltk/cluster/em.py:@python_2_unicode_compatible
nltk/cluster/gaac.py:from nltk.compat import python_2_unicode_compatible
nltk/cluster/gaac.py:@python_2_unicode_compatible
nltk/cluster/kmeans.py:from nltk.compat import python_2_unicode_compatible
nltk/cluster/kmeans.py:@python_2_unicode_compatible
nltk/cluster/util.py:from nltk.compat import python_2_unicode_compatible
nltk/cluster/util.py:@python_2_unicode_compatible
grep: nltk/corpus/reader: Is a directory
nltk/corpus/util.py:from nltk.compat import python_2_unicode_compatible
nltk/corpus/util.py:@python_2_unicode_compatible
nltk/draw/dispersion.py:    import nltk.compat
nltk/draw/table.py:    a way that's incompatible with the fact that ``Table`` behaves as a
nltk/inference/nonmonotonic.py:from nltk.compat import python_2_unicode_compatible
nltk/inference/nonmonotonic.py:@python_2_unicode_compatible
nltk/inference/resolution.py:from nltk.compat import python_2_unicode_compatible
nltk/inference/resolution.py:@python_2_unicode_compatible
nltk/inference/resolution.py:@python_2_unicode_compatible
nltk/lm/counter.py:from nltk import compat
nltk/lm/counter.py:@compat.python_2_unicode_compatible
nltk/lm/models.py:from nltk import compat
nltk/lm/models.py:@compat.python_2_unicode_compatible
nltk/lm/models.py:@compat.python_2_unicode_compatible
nltk/lm/models.py:@compat.python_2_unicode_compatible
nltk/lm/vocabulary.py:from nltk import compat
nltk/lm/vocabulary.py:@compat.python_2_unicode_compatible
nltk/metrics/agreement.py:from nltk.compat import python_2_unicode_compatible
nltk/metrics/agreement.py:@python_2_unicode_compatible
nltk/metrics/confusionmatrix.py:from nltk.compat import python_2_unicode_compatible
nltk/metrics/confusionmatrix.py:@python_2_unicode_compatible
nltk/parse/chart.py:from nltk.compat import python_2_unicode_compatible, unicode_repr
nltk/parse/chart.py:@python_2_unicode_compatible
nltk/parse/chart.py:@python_2_unicode_compatible
nltk/parse/chart.py:@python_2_unicode_compatible
nltk/parse/corenlp.py:                response = requests.get(requests.compat.urljoin(self.url, 'live'))
nltk/parse/corenlp.py:                response = requests.get(requests.compat.urljoin(self.url, 'ready'))
nltk/parse/dependencygraph.py:from nltk.compat import python_2_unicode_compatible
nltk/parse/dependencygraph.py:@python_2_unicode_compatible
nltk/parse/featurechart.py:from nltk.compat import python_2_unicode_compatible
nltk/parse/featurechart.py:@python_2_unicode_compatible
nltk/parse/pchart.py:from nltk.compat import python_2_unicode_compatible
nltk/parse/pchart.py:@python_2_unicode_compatible
nltk/parse/projectivedependencyparser.py:from nltk.compat import python_2_unicode_compatible
nltk/parse/projectivedependencyparser.py:@python_2_unicode_compatible
nltk/parse/projectivedependencyparser.py:@python_2_unicode_compatible
nltk/parse/recursivedescent.py:from nltk.compat import unicode_repr
nltk/parse/shiftreduce.py:from nltk.compat import unicode_repr
nltk/parse/stanford.py:        # Windows is incompatible with NamedTemporaryFile() without passing in delete=False.
nltk/parse/viterbi.py:from nltk.compat import python_2_unicode_compatible
nltk/parse/viterbi.py:@python_2_unicode_compatible
nltk/sem/boxer.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/boxer.py:@python_2_unicode_compatible
nltk/sem/boxer.py:@python_2_unicode_compatible
nltk/sem/boxer.py:@python_2_unicode_compatible
nltk/sem/chat80.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/chat80.py:@python_2_unicode_compatible
nltk/sem/drt.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/drt.py:@python_2_unicode_compatible
nltk/sem/drt.py:@python_2_unicode_compatible
nltk/sem/drt.py:@python_2_unicode_compatible
nltk/sem/drt.py:@python_2_unicode_compatible
nltk/sem/evaluate.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/evaluate.py:@python_2_unicode_compatible
nltk/sem/evaluate.py:@python_2_unicode_compatible
nltk/sem/evaluate.py:@python_2_unicode_compatible
nltk/sem/glue.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/glue.py:@python_2_unicode_compatible
nltk/sem/glue.py:@python_2_unicode_compatible
nltk/sem/hole.py:from nltk import compat
nltk/sem/hole.py:@compat.python_2_unicode_compatible
nltk/sem/lfg.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/lfg.py:@python_2_unicode_compatible
nltk/sem/linearlogic.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/linearlogic.py:@python_2_unicode_compatible
nltk/sem/linearlogic.py:@python_2_unicode_compatible
nltk/sem/linearlogic.py:@python_2_unicode_compatible
nltk/sem/linearlogic.py:@python_2_unicode_compatible
nltk/sem/linearlogic.py:@python_2_unicode_compatible
nltk/sem/logic.py:from nltk.compat import python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/logic.py:@python_2_unicode_compatible
nltk/sem/util.py:    Check that interpret_sents() is compatible with legacy grammars that use
nltk/sentiment/util.py:        # The protocol=2 parameter is for python2 compatibility
nltk/sentiment/util.py:        (writer, outf) = outf_writer_compat(outfile, encoding, errors, gzip_compress)
nltk/sentiment/util.py:    from nltk.twitter.common import outf_writer_compat, extract_fields
nltk/sentiment/vader.py:# ensure Python 3 compatibility, and refactoring to achieve greater modularity.
nltk/stem/cistem.py:from nltk.compat import python_2_unicode_compatible
nltk/stem/cistem.py:@python_2_unicode_compatible
nltk/stem/lancaster.py:from nltk.compat import python_2_unicode_compatible
nltk/stem/lancaster.py:@python_2_unicode_compatible
nltk/stem/porter.py:from nltk.compat import python_2_unicode_compatible
nltk/stem/porter.py:@python_2_unicode_compatible
nltk/stem/regexp.py:from nltk.compat import python_2_unicode_compatible
nltk/stem/regexp.py:@python_2_unicode_compatible
nltk/stem/snowball.py:from nltk import compat
nltk/stem/snowball.py:@compat.python_2_unicode_compatible
nltk/stem/wordnet.py:from nltk.compat import python_2_unicode_compatible
nltk/stem/wordnet.py:@python_2_unicode_compatible
nltk/tag/hmm.py:from nltk.compat import python_2_unicode_compatible
nltk/tag/hmm.py:@python_2_unicode_compatible
nltk/tag/perceptron.py:from nltk.compat import python_2_unicode_compatible
nltk/tag/perceptron.py:@python_2_unicode_compatible
nltk/tag/perceptron.py:                # changed protocol from -1 to 2 to make pickling Python 2 compatible
nltk/tag/senna.py:from nltk.compat import python_2_unicode_compatible
nltk/tag/senna.py:@python_2_unicode_compatible
nltk/tag/senna.py:@python_2_unicode_compatible
nltk/tag/senna.py:@python_2_unicode_compatible
nltk/tag/sequential.py:from nltk.compat import python_2_unicode_compatible
nltk/tag/sequential.py:@python_2_unicode_compatible
nltk/tag/sequential.py:@python_2_unicode_compatible
nltk/tag/sequential.py:@python_2_unicode_compatible
nltk/tag/sequential.py:@python_2_unicode_compatible
nltk/tbl/rule.py:from nltk.compat import python_2_unicode_compatible, unicode_repr
nltk/tbl/rule.py:@python_2_unicode_compatible
nltk/tbl/rule.py:                # for complete compatibility with the wordy format of nltk2
nltk/tbl/template.py:        An alternative calling convention (kept for backwards compatibility,
nltk/test/compat.doctest:NLTK comes with a Python 2.x/3.x compatibility layer, nltk.compat
nltk/test/compat.doctest:    >>> from nltk import compat
nltk/test/compat.doctest:    >>> compat.PY3
nltk/test/compat.doctest:@python_2_unicode_compatible
nltk/test/compat.doctest:``@python_2_unicode_compatible`` decorator allows writing these methods
nltk/test/compat.doctest:in a way compatible with Python 3.x:
nltk/test/compat.doctest:    >>> from nltk.compat import python_2_unicode_compatible
nltk/test/compat.doctest:    >>> @python_2_unicode_compatible
nltk/test/compat.doctest:There is no need to wrap a subclass with ``@python_2_unicode_compatible``
nltk/test/compat.doctest:    >>> @python_2_unicode_compatible
nltk/test/compat.doctest:Applying ``@python_2_unicode_compatible`` to a subclass
nltk/test/compat.doctest:``nltk.compat.unicode_repr`` function may be used instead of ``repr`` and
nltk/test/compat.doctest:    >>> from nltk.compat import unicode_repr
nltk/test/compat.doctest:of objects which class was fixed by ``@python_2_unicode_compatible``
nltk/test/compat.doctest:    >>> @python_2_unicode_compatible
nltk/test/compat.doctest:methods of classes fixed by ``@python_2_unicode_compatible``
nltk/test/compat_fixt.py:from nltk.compat import PY3
nltk/test/compat_fixt.py:        raise SkipTest("compat.doctest is for Python 2.x")
nltk/test/corpus.doctest:    >>> # We use float for backward compatibility with division in Python2.7
nltk/test/data.doctest:    >>> from nltk.compat import StringIO
nltk/test/floresta.txt:O norte-americano Pete Sampras foi afastado pelo seu compatriota Jim Courier (24? ATP) pelos parciais de 7-6 (7-5), 6-4, o que significa que o n?mero um do mundo vai chegar ? ?catedral da terra batida?, Roland Garros, com duas derrotas em outros tantos encontros disputados sobre o p? de tijolo.
grep: nltk/test/images: Is a directory
nltk/test/index.doctest:.. _compat howto: compat.html
nltk/test/index.doctest:* `compat HOWTO`_
nltk/test/portuguese_en_fixt.py:from nltk.compat import PY3
nltk/test/semantics.doctest:In order to provide backwards compatibility with 'legacy' grammars where the semantics value
nltk/test/twitter.ipynb:    "from nltk.compat import StringIO\n",
grep: nltk/test/unit: Is a directory
nltk/tokenize/punkt.py:from nltk.compat import unicode_repr, python_2_unicode_compatible
nltk/tokenize/punkt.py:@python_2_unicode_compatible
nltk/tokenize/regexp.py:from nltk.compat import python_2_unicode_compatible
nltk/tokenize/regexp.py:@python_2_unicode_compatible
nltk/tokenize/stanford.py:        # Windows is incompatible with NamedTemporaryFile() without passing in delete=False.
nltk/tokenize/stanford_segmenter.py:from nltk import compat
nltk/translate/api.py:from nltk.compat import python_2_unicode_compatible
nltk/translate/api.py:@python_2_unicode_compatible
nltk/translate/api.py:@python_2_unicode_compatible
nltk/translate/bleu_score.py:    from nltk.compat import Fraction
nltk/twitter/api.py:from nltk.compat import UTC
nltk/twitter/common.py:from nltk import compat
nltk/twitter/common.py:    (writer, outf) = outf_writer_compat(outfile, encoding, errors, gzip_compress)
nltk/twitter/common.py:def outf_writer_compat(outfile, encoding, errors, gzip_compress=False):
nltk/twitter/common.py:    if compat.PY3:
nltk/twitter/common.py:        writer = compat.UnicodeWriter(outf, encoding=encoding, errors=errors)
nltk/twitter/common.py:    (writer, outf) = outf_writer_compat(outfile, encoding, errors, gzip_compress)
nltk/twitter/twitter_demo.py:from nltk.compat import StringIO
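
The `@python_2_unicode_compatible` decorator is effectively a no-op under Python 3 (it only rewrites `__str__`/`__unicode__` on Python 2), so removing it should be mechanical: drop the decorator and the `nltk.compat` import and keep the existing `__str__`; `unicode_repr` should likewise reduce to plain `repr`. An illustrative before/after on a made-up class, not taken from NLTK:

    # Before: Python 2/3 dual support via the compat decorator.
    from nltk.compat import python_2_unicode_compatible

    @python_2_unicode_compatible
    class Label:
        def __init__(self, text):
            self.text = text

        def __str__(self):
            return self.text

    # After dropping 2.7: the import and the decorator simply disappear;
    # Python 3 already treats __str__ as returning text.
    class Label:
        def __init__(self, text):
            self.text = text

        def __str__(self):
            return self.text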

And some low-hanging fruit:

~/git-stuff/my-contrib/nltk$ grep "sys.version" nltk/**/*
grep: nltk/corpus/reader: Is a directory
nltk/lm/vocabulary.py:    if sys.version_info[0] == 2:
nltk/sem/evaluate.py:    if sys.version_info[0] >= 3:
nltk/sentiment/util.py:    if sys.version_info[0] == 3:
nltk/sentiment/util.py:    elif sys.version_info[0] < 3:
grep: nltk/test/images: Is a directory
grep: nltk/test/unit: Is a directory
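
Each of these version guards collapses to its Python 3 branch. An illustrative reconstruction of the pattern, not the exact NLTK code (the comment in nltk/sentiment/util.py above suggests the pickle protocol was held back for Python 2 readers):

    import pickle
    import sys

    # Before: file mode and pickle protocol chosen per interpreter version.
    def dump_model(model, filename):
        if sys.version_info[0] == 3:
            fh = open(filename, "wb")
        else:
            fh = open(filename, "w")
        pickle.dump(model, fh, protocol=2)  # protocol 2 for Python 2 readers
        fh.close()

    # After dropping 2.7: no version check, and the protocol no longer
    # needs to be held back.
    def dump_model(model, filename):
        with open(filename, "wb") as fh:
            pickle.dump(model, fh, protocol=pickle.HIGHEST_PROTOCOL)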

With Python 3.8 supported since #2432, maybe it's time to officially drop Python 2.7? It's 2020 =)

P/S: It has already been out of our CI workflow for quite a while.

If #2504 gets merged, there are still a few Python 2 leftovers:

  • [x] The test plugin in nltk/test/doctest_nose_plugin.py, which ignores the leading u on Unicode literals so that equality comparisons still pass (see the sketch after this list). If we want to drop this plugin, we will have to make sure all Unicode literals are removed from the tests.
  • [ ] web/dev/local_testing.rst still mentions Python 2 and should be updated.
  • [ ] mentions of Python 2.7 and Python 2.6 in the code should be checked to see whether they are still relevant.
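
For context on the first item: the plugin's job is to let doctests written with Python 2 style u'...' output keep passing under Python 3. A minimal sketch of that kind of normalization using the standard doctest API (the class name and regex are mine, not the actual plugin code):

    import doctest
    import re

    # Strip a leading u/U prefix from string literals in doctest output.
    _U_PREFIX = re.compile(r"""\b[uU](['"])""")

    class UPrefixTolerantChecker(doctest.OutputChecker):
        """Treat u'...' and '...' as equal when comparing doctest output."""

        def check_output(self, want, got, optionflags):
            want = _U_PREFIX.sub(r"\1", want)
            got = _U_PREFIX.sub(r"\1", got)
            return super().check_output(want, got, optionflags)

Once the u'...' literals are gone from the doctests themselves, a checker like this (and the nose plugin that carries it) can be retired as well.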