
Rollup of 16 pull requests #58669

Merged
65 commits merged on Feb 23, 2019
Changes from 1 commit
Commits
65 commits
eb5b096
RangeInclusive internal iteration performance improvement.
matthieu-m Feb 3, 2019
a15916b
[WIP] add better error message for partial move
clintfred Feb 5, 2019
50be479
Updated RELEASES.md for 1.33.0
Feb 6, 2019
d4c52bf
error output updated by ./x.py test --stage 1 src/test/ui --increment…
clintfred Feb 6, 2019
6c71e7d
Update RELEASES.md
Aaronepower Feb 7, 2019
fb3ae57
Update RELEASES.md
mark-i-m Feb 9, 2019
4fed67f
Fix exhaustion of inclusive range try_fold and try_rfold
matthieu-m Feb 9, 2019
73921f6
Update RELEASES.md
Centril Feb 12, 2019
8a026f1
Update RELEASES.md
Centril Feb 12, 2019
4c0a3d5
Update RELEASES.md
Centril Feb 12, 2019
ee3371e
Update RELEASES.md
Centril Feb 12, 2019
4e5eda3
compute is_partial_move outside of the move_site loop for clarity
clintfred Feb 13, 2019
a496450
Update RELEASES.md
Centril Feb 13, 2019
96fd218
check if `used_place` and `moved_place` are equal when determining if…
clintfred Feb 13, 2019
755b320
simplified conditional
clintfred Feb 13, 2019
283ffcf
Check the self-type of inherent associated constants
matthewjasper Feb 11, 2019
347a42e
SGX target: fix panic = abort
Feb 14, 2019
c34aac7
help suggestion when trying to delimit string literals with directed …
pmccarter Feb 17, 2019
71cd4c8
ui test for directed quote help suggestion #58436
pmccarter Feb 17, 2019
d26bf74
Change `Token::interpolated_to_tokenstream()`.
nnethercote Feb 14, 2019
f8801f3
Remove `LazyTokenStream`.
nnethercote Feb 14, 2019
f0d8fbd
Avoid a `clone()` in `transcribe()`.
nnethercote Feb 15, 2019
82ad4f1
Make `interpolated_to_tokenstream` a method on `Nonterminal`.
nnethercote Feb 17, 2019
895a794
Remove some unnecessary `into()` calls.
nnethercote Feb 17, 2019
de05548
re-blessing error output: ./x.py test src/test/ui --stage 1 --bless
clintfred Feb 18, 2019
8e219e7
Turn duration consts into associated consts
Feb 20, 2019
2621564
Update RELEASES.md
Aaronepower Feb 20, 2019
d072510
Update RELEASES.md
Aaronepower Feb 20, 2019
02fe6a7
./x.py test src/test/ui --stage 1 --bless -i --compare-mode=nll
clintfred Feb 20, 2019
f223c03
Add examples for duration constants
Feb 20, 2019
36f18f2
Allow Self::Module to be mutated.
gabi-250 Feb 20, 2019
42d749c
Update RELEASES.md
tspiteri Feb 20, 2019
c6d24cd
Enable feature duration_constants in examples
Feb 21, 2019
8060eb4
Update RELEASES.md
Aaronepower Feb 21, 2019
0ab2aed
Update RELEASES.md
Aaronepower Feb 21, 2019
e5d1fa5
codegen and write_metadata can mutate ModuleLLvm.
gabi-250 Feb 21, 2019
9f58c5f
Optimise vec![false; N] to zero-alloc
RReverser Feb 21, 2019
b5ae4d5
Don't generate minification variable if minification disabled
GuillaumeGomez Feb 22, 2019
e555854
Make target pointer-width specific variants of (very old) huge-array-…
pnkfelix Feb 22, 2019
b72ba05
Switch from error patterns to `//~ ERROR` markers.
pnkfelix Feb 22, 2019
fda51c2
Update RELEASES.md
Centril Feb 22, 2019
cc1cd83
Do not underflow after resetting unmatched braces count
estebank Feb 22, 2019
5f27a25
Invalid byte alignment expected/provided in message #58617
pmccarter Feb 22, 2019
8ee1c07
Change byte align message wording #58617
pmccarter Feb 22, 2019
5952c61
tidy line length override #58617
pmccarter Feb 22, 2019
d0c110f
#58658 bless after line split for tidy
pmccarter Feb 23, 2019
42d5cf8
reduce an mir code repetition like (n << amt) >> amt
kenta7777 Feb 23, 2019
1932d7a
Transition librustdoc to 2018 edition
h-michael Feb 23, 2019
1fe87df
Fix tidy check errors
h-michael Feb 23, 2019
c49da5b
Rollup merge of #58100 - h-michael:librustdoc-2018, r=Centril
Centril Feb 23, 2019
f19bec8
Rollup merge of #58122 - matthieu-m:range_incl_perf, r=dtolnay
Centril Feb 23, 2019
1d6657d
Rollup merge of #58199 - clintfred:partial-move-err-msg, r=estebank
Centril Feb 23, 2019
abd6f50
Rollup merge of #58227 - Aaronepower:master, r=Centril
Centril Feb 23, 2019
3688643
Rollup merge of #58353 - matthewjasper:typeck-pattern-constants, r=ar…
Centril Feb 23, 2019
4f99061
Rollup merge of #58453 - jethrogb:jb/sgx-panic-abort, r=nagisa
Centril Feb 23, 2019
585d4d2
Rollup merge of #58476 - nnethercote:rm-LazyTokenStream, r=petrochenkov
Centril Feb 23, 2019
18dd2d2
Rollup merge of #58526 - pmccarter:master, r=estebank
Centril Feb 23, 2019
73e661a
Rollup merge of #58595 - stjepang:make-duration-consts-associated, r=…
Centril Feb 23, 2019
8ccda24
Rollup merge of #58609 - gabi-250:mutable-refs, r=oli-obk
Centril Feb 23, 2019
93bfa92
Rollup merge of #58628 - RReverser:optimise-vec-false, r=oli-obk
Centril Feb 23, 2019
69cb908
Rollup merge of #58643 - GuillaumeGomez:extra-variables, r=Manishearth
Centril Feb 23, 2019
c2ad75e
Rollup merge of #58648 - pnkfelix:issue-23926-update-tests, r=nikomat…
Centril Feb 23, 2019
2db0e48
Rollup merge of #58654 - estebank:underflow, r=nikomatsakis
Centril Feb 23, 2019
d038fb2
Rollup merge of #58658 - pmccarter:align_msg, r=matthewjasper
Centril Feb 23, 2019
a36d1b9
Rollup merge of #58667 - kenta7777:reduce-mir-code-repetition, r=petr…
Centril Feb 23, 2019
Make interpolated_to_tokenstream a method on Nonterminal.
nnethercote committed Feb 17, 2019
commit 82ad4f1f45a60995c0955e28bbed3885008e3ee5
2 changes: 1 addition & 1 deletion src/librustc/hir/lowering.rs
@@ -1132,7 +1132,7 @@ impl<'a> LoweringContext<'a> {
fn lower_token(&mut self, token: Token, span: Span) -> TokenStream {
match token {
Token::Interpolated(nt) => {
- let tts = Token::interpolated_to_tokenstream(&self.sess.parse_sess, nt, span);
+ let tts = nt.to_tokenstream(&self.sess.parse_sess, span);
self.lower_token_stream(tts)
}
other => TokenTree::Token(span, other).into(),
163 changes: 81 additions & 82 deletions src/libsyntax/parse/token.rs
@@ -86,7 +86,7 @@ impl Lit {
}
}

- // See comments in `interpolated_to_tokenstream` for why we care about
+ // See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
fn probably_equal_for_proc_macro(&self, other: &Lit) -> bool {
mem::discriminant(self) == mem::discriminant(other)
@@ -502,87 +502,7 @@ impl Token {
}
}

pub fn interpolated_to_tokenstream(sess: &ParseSess, nt: Lrc<Nonterminal>, span: Span)
-> TokenStream {
// An `Interpolated` token means that we have a `Nonterminal`
// which is often a parsed AST item. At this point we now need
// to convert the parsed AST to an actual token stream, e.g.
// un-parse it basically.
//
// Unfortunately there's not really a great way to do that in a
// guaranteed lossless fashion right now. The fallback here is
// to just stringify the AST node and reparse it, but this loses
// all span information.
//
// As a result, some AST nodes are annotated with the token
// stream they came from. Here we attempt to extract these
// lossless token streams before we fall back to the
// stringification.
let tokens = match *nt {
Nonterminal::NtItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtTraitItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtImplItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtIdent(ident, is_raw) => {
let token = Token::Ident(ident, is_raw);
Some(TokenTree::Token(ident.span, token).into())
}
Nonterminal::NtLifetime(ident) => {
let token = Token::Lifetime(ident);
Some(TokenTree::Token(ident.span, token).into())
}
Nonterminal::NtTT(ref tt) => {
Some(tt.clone().into())
}
_ => None,
};

// FIXME(#43081): Avoid this pretty-print + reparse hack
let source = pprust::nonterminal_to_string(&nt);
let filename = FileName::macro_expansion_source_code(&source);
let (tokens_for_real, errors) =
parse_stream_from_source_str(filename, source, sess, Some(span));
emit_unclosed_delims(&errors, &sess.span_diagnostic);

// During early phases of the compiler the AST could get modified
// directly (e.g., attributes added or removed) and the internal cache
of tokens may not be invalidated or updated. Consequently if the
// "lossless" token stream disagrees with our actual stringification
// (which has historically been much more battle-tested) then we go
// with the lossy stream anyway (losing span information).
//
// Note that the comparison isn't `==` here to avoid comparing spans,
// but it *also* is a "probable" equality which is a pretty weird
// definition. We mostly want to catch actual changes to the AST
// like a `#[cfg]` being processed or some weird `macro_rules!`
// expansion.
//
// What we *don't* want to catch is the fact that a user-defined
// literal like `0xf` is stringified as `15`, causing the cached token
// stream to not be literal `==` token-wise (ignoring spans) to the
// token stream we got from stringification.
//
// Instead the "probably equal" check here is "does each token
// recursively have the same discriminant?" We basically don't look at
// the token values here and assume that such fine grained token stream
// modifications, including adding/removing typically non-semantic
// tokens such as extra braces and commas, don't happen.
if let Some(tokens) = tokens {
if tokens.probably_equal_for_proc_macro(&tokens_for_real) {
return tokens
}
info!("cached tokens found, but they're not \"probably equal\", \
going with stringified version");
}
return tokens_for_real
}

- // See comments in `interpolated_to_tokenstream` for why we care about
+ // See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
crate fn probably_equal_for_proc_macro(&self, other: &Token) -> bool {
if mem::discriminant(self) != mem::discriminant(other) {
@@ -714,6 +634,85 @@ impl fmt::Debug for Nonterminal {
}
}

impl Nonterminal {
pub fn to_tokenstream(&self, sess: &ParseSess, span: Span) -> TokenStream {
// A `Nonterminal` is often a parsed AST item. At this point we now
// need to convert the parsed AST to an actual token stream, e.g.
// un-parse it basically.
//
// Unfortunately there's not really a great way to do that in a
// guaranteed lossless fashion right now. The fallback here is to just
// stringify the AST node and reparse it, but this loses all span
// information.
//
// As a result, some AST nodes are annotated with the token stream they
// came from. Here we attempt to extract these lossless token streams
// before we fall back to the stringification.
let tokens = match *self {
Nonterminal::NtItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtTraitItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtImplItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtIdent(ident, is_raw) => {
let token = Token::Ident(ident, is_raw);
Some(TokenTree::Token(ident.span, token).into())
}
Nonterminal::NtLifetime(ident) => {
let token = Token::Lifetime(ident);
Some(TokenTree::Token(ident.span, token).into())
}
Nonterminal::NtTT(ref tt) => {
Some(tt.clone().into())
}
_ => None,
};

// FIXME(#43081): Avoid this pretty-print + reparse hack
let source = pprust::nonterminal_to_string(self);
let filename = FileName::macro_expansion_source_code(&source);
let (tokens_for_real, errors) =
parse_stream_from_source_str(filename, source, sess, Some(span));
emit_unclosed_delims(&errors, &sess.span_diagnostic);

// During early phases of the compiler the AST could get modified
// directly (e.g., attributes added or removed) and the internal cache
of tokens may not be invalidated or updated. Consequently if the
// "lossless" token stream disagrees with our actual stringification
// (which has historically been much more battle-tested) then we go
// with the lossy stream anyway (losing span information).
//
// Note that the comparison isn't `==` here to avoid comparing spans,
// but it *also* is a "probable" equality which is a pretty weird
// definition. We mostly want to catch actual changes to the AST
// like a `#[cfg]` being processed or some weird `macro_rules!`
// expansion.
//
// What we *don't* want to catch is the fact that a user-defined
// literal like `0xf` is stringified as `15`, causing the cached token
// stream to not be literal `==` token-wise (ignoring spans) to the
// token stream we got from stringification.
//
// Instead the "probably equal" check here is "does each token
// recursively have the same discriminant?" We basically don't look at
// the token values here and assume that such fine grained token stream
// modifications, including adding/removing typically non-semantic
// tokens such as extra braces and commas, don't happen.
if let Some(tokens) = tokens {
if tokens.probably_equal_for_proc_macro(&tokens_for_real) {
return tokens
}
info!("cached tokens found, but they're not \"probably equal\", \
going with stringified version");
}
return tokens_for_real
}
}

crate fn is_op(tok: &Token) -> bool {
match *tok {
OpenDelim(..) | CloseDelim(..) | Literal(..) | DocComment(..) |
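The "probably equal" comparison referenced throughout the hunks above is a discriminant-only check: two tokens match if they are the same enum variant, regardless of payload, so a cached `0xf` literal still matches a re-stringified `15`. A minimal standalone sketch of that idea, assuming simplified stand-in types (`Tok` and the helper names here are illustrative, not rustc's actual API):

```rust
use std::mem;

// Illustrative stand-in for a token type; not rustc's actual `Token`.
#[derive(Debug)]
enum Tok {
    Ident(String),
    Literal(String),
    Comma,
}

impl Tok {
    // "Probably equal": same variant (discriminant), payload ignored.
    // This deliberately treats `Literal("0xf")` and `Literal("15")` as
    // equal, which is what the lossless-vs-stringified comparison needs.
    fn probably_equal(&self, other: &Tok) -> bool {
        mem::discriminant(self) == mem::discriminant(other)
    }
}

// Streams are "probably equal" if they pair up token-by-token.
fn streams_probably_equal(a: &[Tok], b: &[Tok]) -> bool {
    a.len() == b.len() && a.iter().zip(b).all(|(x, y)| x.probably_equal(y))
}

fn main() {
    let cached = [Tok::Ident("x".into()), Tok::Comma, Tok::Literal("0xf".into())];
    let reparsed = [Tok::Ident("x".into()), Tok::Comma, Tok::Literal("15".into())];
    // Different literal spellings still count as "probably equal".
    assert!(streams_probably_equal(&cached, &reparsed));

    // A real structural change (a token dropped) is caught.
    let changed = [Tok::Ident("x".into()), Tok::Comma];
    assert!(!streams_probably_equal(&cached, &changed));
    println!("probably-equal checks passed");
}
```

The real implementation recurses into delimited groups as well, but the variant-discriminant comparison is the core of it.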
4 changes: 2 additions & 2 deletions src/libsyntax/tokenstream.rs
@@ -72,7 +72,7 @@ impl TokenTree {
}
}

- // See comments in `interpolated_to_tokenstream` for why we care about
+ // See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
//
// This is otherwise the same as `eq_unspanned`, only recursing with a
@@ -310,7 +310,7 @@ impl TokenStream {
t1.next().is_none() && t2.next().is_none()
}

- // See comments in `interpolated_to_tokenstream` for why we care about
+ // See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
//
// This is otherwise the same as `eq_unspanned`, only recursing with a
2 changes: 1 addition & 1 deletion src/libsyntax_ext/proc_macro_server.rs
@@ -179,7 +179,7 @@ impl FromInternal<(TreeAndJoint, &'_ ParseSess, &'_ mut Vec<Self>)>
}

Interpolated(nt) => {
- let stream = Token::interpolated_to_tokenstream(sess, nt, span);
+ let stream = nt.to_tokenstream(sess, span);
TokenTree::Group(Group {
delimiter: Delimiter::None,
stream,
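Taken together, the commit moves the conversion from a free associated function on `Token` (taking an `Lrc<Nonterminal>`) to an inherent method on `Nonterminal`, so call sites shrink from `Token::interpolated_to_tokenstream(sess, nt, span)` to `nt.to_tokenstream(sess, span)`. A minimal sketch of the cached-tokens-with-lossy-fallback shape as a method — the types, fields, and the exact-equality stand-in for "probably equal" are all simplifications, not rustc's real API:

```rust
#[derive(Clone, Debug, PartialEq)]
struct Tokens(String); // stand-in for a real TokenStream

struct Nt {
    cached: Option<Tokens>, // lossless tokens the AST node kept, if any
    pretty: String,         // what pretty-printing the node would yield
}

impl Nt {
    // As an inherent method, callers can simply write `nt.to_tokens()`.
    fn to_tokens(&self) -> Tokens {
        // Fallback path: stringify and "reparse" (modeled as a copy here).
        let reparsed = Tokens(self.pretty.clone());
        if let Some(cached) = &self.cached {
            // Prefer the cached stream only when it agrees with the
            // stringified version (exact equality stands in for the
            // "probably equal" discriminant check).
            if cached == &reparsed {
                return cached.clone();
            }
        }
        reparsed // lossy fallback: span info lost, content trusted
    }
}

fn main() {
    let fresh = Nt { cached: Some(Tokens("a + b".into())), pretty: "a + b".into() };
    assert_eq!(fresh.to_tokens().0, "a + b");

    // A stale cache (AST mutated after tokens were recorded) is discarded.
    let stale = Nt { cached: Some(Tokens("old".into())), pretty: "new".into() };
    assert_eq!(stale.to_tokens().0, "new");
    println!("fallback logic ok");
}
```

Making the conversion a method also removes the need to pass the nonterminal by `Lrc`, which is what lets the later commits in this stack drop some `clone()` and `into()` calls.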