Document warning cleanup + new version of docs builder

This commit is contained in:
Ben Wyeth
2019-04-11 14:43:47 +01:00
committed by Anthony Keenan
parent 87720163f8
commit f9916e673c
30 changed files with 1895 additions and 47 deletions


@@ -0,0 +1 @@
Pygments*.whl


@@ -0,0 +1,13 @@
FROM python:2-stretch
RUN apt-get update \
&& apt-get --no-install-recommends install -y texlive preview-latex-style texlive-generic-extra texlive-latex-extra latexmk dos2unix \
&& apt-get -y clean \
&& rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
ENV PATH="/opt/docs_builder:${PATH}"
WORKDIR /opt/docs_builder
COPY requirements.txt requirements.txt
COPY docs_builder/lexer-fix/Pygments*.whl .
RUN pip install -r requirements.txt
RUN pip install Pygments*.whl --force-reinstall

File diff suppressed because it is too large


@@ -0,0 +1,35 @@
# Pygments lexer
The docs build was producing so many spurious warnings that people were unable to see the real warnings in the noise, so we're on a mission to sort all of these out.

A lot of the warnings arose because the Kotlin lexer in Pygments (the syntax highlighter that Sphinx uses) didn't cope with a lot of the Kotlin syntax that we use.

We have fixes for the Kotlin lexer that we are trying to get checked into Pygments, but while this is taking place we need to maintain a slightly patched corda/docs-builder Docker image in which to build the docs.
## Some notes on building and testing
The Sphinx/Pygments brigade have delightfully decided that Mercurial is a good idea. So, broadly speaking, to build/test a fix:
* check out Pygments from [here](https://bitbucket.org/birkenfeld/pygments-main/overview)
* copy the two Python files in (it might be worth diffing first; they're based on 2.3.1, and nb the Kotlin test is entirely new)
* build pygments whl file
```
cd /path/to/pygments/
python setup.py install
pip install wheel
wheel convert dist/Pygments-2.3.1.dev20190 # obviously use your version
cp Pygments-2.3.1.dev20190401-py27-none-any.whl /path/to/corda/docs/source/docs_builder/lexer-fix
```
* build the latest docker build (see docs readme)
```
cd docs
docker build -t corda/docs-builder:latest -f docs_builder/lexer-fix/Dockerfile .
```
* push the new image up to Docker Hub (nb you can also test by going to /opt/docs and running `./make-docsite.sh`)
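Before rebuilding the Docker image, the patched lexer can also be smoke-tested straight from a Python shell. A minimal sketch, assuming the patched Pygments wheel is installed in the current environment; the fragment is one of the backtick-name cases that tripped the stock lexer:

```python
# Smoke-test the Kotlin lexer on a fragment the stock 2.3.1 lexer
# struggled with (assumes the patched Pygments is installed locally).
from pygments.lexers import KotlinLexer

fragment = u'fun `wo bble`'
for token_type, value in KotlinLexer().get_tokens(fragment):
    print(token_type, repr(value))
```

If the lexer still mishandles the fragment, the output will typically contain `Token.Error` entries, which is the same failure mode that surfaces as highlighting warnings in the Sphinx build.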


@@ -0,0 +1,131 @@
# -*- coding: utf-8 -*-
"""
    Basic KotlinLexer Test
    ~~~~~~~~~~~~~~~~~~~~~~

    :copyright: Copyright 2006-2017 by the Pygments team, see AUTHORS.
    :license: BSD, see LICENSE for details.
"""

import unittest

from pygments.token import Text, Name, Operator, Keyword, Number, Punctuation, String
from pygments.lexers import KotlinLexer


class KotlinTest(unittest.TestCase):

    def setUp(self):
        self.lexer = KotlinLexer()
        self.maxDiff = None

    def testCanCopeWithBackTickNamesInFunctions(self):
        fragment = u'fun `wo bble`'
        tokens = [
            (Keyword, u'fun'),
            (Text, u' '),
            (Name.Function, u'`wo bble`'),
            (Text, u'\n')
        ]
        self.assertEqual(tokens, list(self.lexer.get_tokens(fragment)))

    def testCanCopeWithCommasAndDashesInBackTickNames(self):
        fragment = u'fun `wo,-bble`'
        tokens = [
            (Keyword, u'fun'),
            (Text, u' '),
            (Name.Function, u'`wo,-bble`'),
            (Text, u'\n')
        ]
        self.assertEqual(tokens, list(self.lexer.get_tokens(fragment)))

    def testCanCopeWithDestructuring(self):
        fragment = u'val (a, b) = '
        tokens = [
            (Keyword, u'val'),
            (Text, u' '),
            (Punctuation, u'('),
            (Name.Property, u'a'),
            (Punctuation, u','),
            (Text, u' '),
            (Name.Property, u'b'),
            (Punctuation, u')'),
            (Text, u' '),
            (Punctuation, u'='),
            (Text, u' '),
            (Text, u'\n')
        ]
        self.assertEqual(tokens, list(self.lexer.get_tokens(fragment)))

    def testCanCopeGenericsInDestructuring(self):
        fragment = u'val (a: List<Something>, b: Set<Wobble>) ='
        tokens = [
            (Keyword, u'val'),
            (Text, u' '),
            (Punctuation, u'('),
            (Name.Property, u'a'),
            (Punctuation, u':'),
            (Text, u' '),
            (Name.Property, u'List'),
            (Punctuation, u'<'),
            (Name, u'Something'),
            (Punctuation, u'>'),
            (Punctuation, u','),
            (Text, u' '),
            (Name.Property, u'b'),
            (Punctuation, u':'),
            (Text, u' '),
            (Name.Property, u'Set'),
            (Punctuation, u'<'),
            (Name, u'Wobble'),
            (Punctuation, u'>'),
            (Punctuation, u')'),
            (Text, u' '),
            (Punctuation, u'='),
            (Text, u'\n')
        ]
        self.assertEqual(tokens, list(self.lexer.get_tokens(fragment)))

    def testCanCopeWithGenerics(self):
        fragment = u'inline fun <reified T : ContractState> VaultService.queryBy(): Vault.Page<T> {'
        tokens = [
            (Keyword, u'inline fun'),
            (Text, u' '),
            (Punctuation, u'<'),
            (Keyword, u'reified'),
            (Text, u' '),
            (Name, u'T'),
            (Text, u' '),
            (Punctuation, u':'),
            (Text, u' '),
            (Name, u'ContractState'),
            (Punctuation, u'>'),
            (Text, u' '),
            (Name.Class, u'VaultService'),
            (Punctuation, u'.'),
            (Name.Function, u'queryBy'),
            (Punctuation, u'('),
            (Punctuation, u')'),
            (Punctuation, u':'),
            (Text, u' '),
            (Name, u'Vault'),
            (Punctuation, u'.'),
            (Name, u'Page'),
            (Punctuation, u'<'),
            (Name, u'T'),
            (Punctuation, u'>'),
            (Text, u' '),
            (Punctuation, u'{'),
            (Text, u'\n')
        ]
        self.assertEqual(tokens, list(self.lexer.get_tokens(fragment)))

    def testShouldCopeWithMultilineComments(self):
        fragment = u'"""\nthis\nis\na\ncomment"""'
        tokens = [
            (String, u'"""\nthis\nis\na\ncomment"""'),
            (Text, u'\n')
        ]
        self.assertEqual(tokens, list(self.lexer.get_tokens(fragment)))


if __name__ == '__main__':
    unittest.main()
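The tests above all follow the same pattern: lex a fragment with `get_tokens()` and compare the resulting `(token_type, value)` pairs against an expected list. A minimal sketch of that pattern outside the unittest harness, assuming Pygments is installed (`val answer` is an arbitrary illustrative fragment, not one from the suite):

```python
from pygments.lexers import KotlinLexer
from pygments.token import Keyword

# get_tokens() yields (token_type, value) pairs, and Pygments guarantees
# the stream ends with a newline, which is why every expected list in the
# tests above finishes with a u'\n' token.
pairs = list(KotlinLexer().get_tokens(u'val answer'))
assert pairs[0][1] == u'val'
assert pairs[0][0] in Keyword  # subtype check: e.g. Keyword.Declaration also passes
print(pairs)
```

Comparing whole token lists with `assertEqual`, as the suite does, catches both mis-typed tokens and mis-split values in one assertion.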