Rollup of 9 pull requests #36491

Merged: 38 commits, Sep 15, 2016
Commits
2a2c9d3
Improve shallow `Clone` deriving
petrochenkov Aug 26, 2016
62cb751
Improve `Eq` deriving
petrochenkov Aug 26, 2016
6b99e01
Delete stray ` character in error message.
solson Sep 11, 2016
0a62676
fix "X is not a member of trait Y" span labels
durka Sep 12, 2016
50f94f6
Avoid needless reexpansions.
jseyfried Sep 2, 2016
60440b2
Refactor `noop_fold_stmt_kind` out of `noop_fold_stmt`.
jseyfried Sep 4, 2016
a9821e1
Refactor `ExtCtxt` to use a `Resolver` instead of a `MacroLoader`.
jseyfried Sep 5, 2016
20b43b2
Rewrite the unit tests in `ext/expand.rs` as a `compile-fail` test.
jseyfried Sep 8, 2016
72a6369
Move macro resolution into `librustc_resolve`.
jseyfried Sep 7, 2016
c86c8d4
Perform node id assignment and `macros_at_scope` construction during
jseyfried Sep 5, 2016
f3c2dca
Remove scope placeholders from the crate root.
jseyfried Sep 6, 2016
78c0039
Expand generated test harnesses and macro registries.
jseyfried Sep 6, 2016
b54e1e3
Differentiate between monotonic and non-monotonic expansion and
jseyfried Sep 6, 2016
5a881e9
Make sure that projection bounds in ty::TraitObject are sorted in a w…
michaelwoerister Sep 12, 2016
5c923f0
Remove redundant sorting of projection bounds in tyencode.
michaelwoerister Sep 12, 2016
94d7501
Remove redundant sorting of projections in TypeIdHasher.
michaelwoerister Sep 13, 2016
75a0dd0
Make TypeIdHasher use DefPath::deterministic_hash() for stability.
michaelwoerister Sep 13, 2016
869d144
TypeIdHasher: Let projections be hashed implicitly by the visitor.
michaelwoerister Sep 13, 2016
377c3e1
Fix rebasing fallout.
michaelwoerister Sep 13, 2016
7ec9b81
TypeIdHasher: Remove more redundant explicit visit calls.
michaelwoerister Sep 13, 2016
b49a26e
invoke drop glue with a ptr to (data, meta)
nikomatsakis Sep 13, 2016
693676d
add missing test
nikomatsakis Sep 13, 2016
606cded
Add checked operation methods to Duration
Sep 14, 2016
07b41b5
Fix Duration::checked_mul documentation
Sep 14, 2016
b1bcd18
Implement add, sub, mul and div methods using checked methods for Dur…
Sep 14, 2016
6353e30
clear obligations-added flag with nested fulfillcx
nikomatsakis Sep 13, 2016
a4ee9c6
core: Use primitive indexing in slice's Index/IndexMut
bluss Sep 13, 2016
f2eb4f1
Fix doc-tests for Duration
Sep 14, 2016
b6321bd
Add feature crate attribute for duration_checked_ops to docs
Sep 15, 2016
7268501
Rollup merge of #36384 - petrochenkov:derclone, r=alexcrichton
Manishearth Sep 15, 2016
ebef6ad
Rollup merge of #36405 - solson:typo, r=eddyb
Manishearth Sep 15, 2016
7494bc7
Rollup merge of #36425 - michaelwoerister:stable-projection-bounds, r…
Manishearth Sep 15, 2016
23e0c24
Rollup merge of #36429 - durka:patch-30, r=nagisa
Manishearth Sep 15, 2016
bab9238
Rollup merge of #36438 - jseyfried:node_ids_in_expansion, r=nrc
Manishearth Sep 15, 2016
69a7f92
Rollup merge of #36454 - bluss:slice-primitive-index, r=alexcrichton
Manishearth Sep 15, 2016
959f764
Rollup merge of #36459 - nikomatsakis:issue-35546, r=eddyb
Manishearth Sep 15, 2016
0c9dc53
Rollup merge of #36461 - nikomatsakis:issue-36053, r=arielb1
Manishearth Sep 15, 2016
ec08128
Rollup merge of #36463 - eugene-bulkin:duration-checked-ops, r=alexcr…
Manishearth Sep 15, 2016
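
The Duration commits above (#36463) add checked arithmetic methods and rebuild the operator impls on top of them; their diff is not shown in this view. A minimal usage sketch against today's std::time::Duration API, for orientation only (the exact set of methods added here is per the commit messages above):

use std::time::Duration;

fn main() {
    let a = Duration::new(1, 500_000_000); // 1.5 s
    let b = Duration::new(0, 700_000_000); // 0.7 s

    // The checked variants return None on overflow or underflow
    // instead of panicking.
    assert_eq!(a.checked_add(b), Some(Duration::new(2, 200_000_000)));
    assert_eq!(b.checked_sub(a), None); // would underflow
    assert_eq!(a.checked_mul(2), Some(Duration::new(3, 0)));
    assert_eq!(a.checked_div(2), Some(Duration::new(0, 750_000_000)));

    // Overflowing the seconds counter also yields None.
    assert_eq!(Duration::new(u64::MAX, 0).checked_add(Duration::new(1, 0)), None);
}
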
19 changes: 16 additions & 3 deletions src/libcore/clone.rs
@@ -113,10 +113,23 @@ pub trait Clone : Sized {
}
}

// FIXME(aburka): this method is used solely by #[derive] to
// assert that every component of a type implements Clone.
// FIXME(aburka): these structs are used solely by #[derive] to
// assert that every component of a type implements Clone or Copy.
//
// This should never be called by user code.
// These structs should never appear in user code.
#[doc(hidden)]
#[allow(missing_debug_implementations)]
#[unstable(feature = "derive_clone_copy",
reason = "deriving hack, should not be public",
issue = "0")]
pub struct AssertParamIsClone<T: Clone + ?Sized> { _field: ::marker::PhantomData<T> }
#[doc(hidden)]
#[allow(missing_debug_implementations)]
#[unstable(feature = "derive_clone_copy",
reason = "deriving hack, should not be public",
issue = "0")]
pub struct AssertParamIsCopy<T: Copy + ?Sized> { _field: ::marker::PhantomData<T> }
#[cfg(stage0)]
#[doc(hidden)]
#[inline(always)]
#[unstable(feature = "derive_clone_copy",
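
For context, these marker structs let the `#[derive]` machinery enforce per-parameter bounds purely through the type system: merely naming the struct in a `let` binding forces the bound to be proven, with nothing executed at runtime. A self-contained sketch of the mechanism using a local stand-in for the hidden std struct (the real derive output is compiler-generated and may differ):

use std::marker::PhantomData;

// Local stand-in with the same shape as the hidden std assertion structs.
#[allow(dead_code)]
struct AssertParamIsCopy<T: Copy + ?Sized> { _field: PhantomData<T> }

struct Wrapper<T>(T);

impl<T: Copy> Copy for Wrapper<T> {}

// Roughly the shape of a shallow Clone impl for a Copy type: assert
// `Self: Copy` at compile time, then just copy the value.
impl<T: Copy> Clone for Wrapper<T> {
    fn clone(&self) -> Self {
        let _: AssertParamIsCopy<Self>;
        *self
    }
}

fn main() {
    let w = Wrapper(7u32);
    assert_eq!(w.clone().0, 7);
}
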
13 changes: 12 additions & 1 deletion src/libcore/cmp.rs
@@ -129,7 +129,7 @@ pub trait PartialEq<Rhs: ?Sized = Self> {
/// This trait can be used with `#[derive]`. When `derive`d, because `Eq` has
/// no extra methods, it is only informing the compiler that this is an
/// equivalence relation rather than a partial equivalence relation. Note that
/// the `derive` strategy requires all fields are `PartialEq`, which isn't
/// the `derive` strategy requires all fields are `Eq`, which isn't
/// always desired.
///
/// ## How can I implement `Eq`?
@@ -165,6 +165,17 @@ pub trait Eq: PartialEq<Self> {
fn assert_receiver_is_total_eq(&self) {}
}

// FIXME: this struct is used solely by #[derive] to
// assert that every component of a type implements Eq.
//
// This struct should never appear in user code.
#[doc(hidden)]
#[allow(missing_debug_implementations)]
#[unstable(feature = "derive_eq",
reason = "deriving hack, should not be public",
issue = "0")]
pub struct AssertParamIsEq<T: Eq + ?Sized> { _field: ::marker::PhantomData<T> }

/// An `Ordering` is the result of a comparison between two values.
///
/// # Examples
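
AssertParamIsEq serves the same purpose for `#[derive(Eq)]`: requiring `Eq` of every field type without generating any runtime code. A hand-written sketch of the trick, using a local stand-in for the hidden std struct (the actual derive output may differ):

use std::marker::PhantomData;

// Local stand-in with the same shape as the hidden std struct above.
struct AssertParamIsEq<T: Eq + ?Sized> { _field: PhantomData<T> }

#[allow(dead_code)]
struct Pair<A, B> { a: A, b: B }

// The derive's trick, written by hand: a no-op body that names
// AssertParamIsEq once per field type, so `A: Eq` and `B: Eq` must
// hold for this to type-check; nothing is compared at runtime.
fn assert_pair_fields_are_eq<A: Eq, B: Eq>(_p: &Pair<A, B>) {
    let _: AssertParamIsEq<A>;
    let _: AssertParamIsEq<B>;
}

fn main() {
    let p = Pair { a: 1u32, b: String::from("x") };
    assert_pair_fields_are_eq(&p); // u32 and String are both Eq
}
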
8 changes: 4 additions & 4 deletions src/libcore/slice.rs
@@ -520,8 +520,8 @@ impl<T> ops::Index<usize> for [T] {
type Output = T;

fn index(&self, index: usize) -> &T {
assert!(index < self.len());
unsafe { self.get_unchecked(index) }
// NB built-in indexing
&(*self)[index]
}
}

@@ -530,8 +530,8 @@ impl<T> ops::Index<usize> for [T] {
impl<T> ops::IndexMut<usize> for [T] {
#[inline]
fn index_mut(&mut self, index: usize) -> &mut T {
assert!(index < self.len());
unsafe { self.get_unchecked_mut(index) }
// NB built-in indexing
&mut (*self)[index]
}
}

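
The new bodies delegate to the compiler's built-in slice indexing, which performs the same bounds check and panics on out-of-range indices, so behaviour is unchanged while the duplicated assert!/get_unchecked pair goes away. A quick illustration of the user-visible semantics:

fn main() {
    let mut xs = vec![10, 20, 30];

    // Indexing goes through ops::Index / ops::IndexMut and is bounds-checked:
    // in range it returns a reference, out of range it panics.
    assert_eq!(xs[1], 20);
    xs[0] = 7;
    assert_eq!(xs, vec![7, 20, 30]);

    // The non-panicking alternative remains slice::get.
    assert_eq!(xs.get(5), None);
}
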
27 changes: 27 additions & 0 deletions src/librustc/infer/mod.rs
@@ -830,6 +830,33 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
result.map(move |t| InferOk { value: t, obligations: fields.obligations })
}

// Clear the "obligations in snapshot" flag, invoke the closure,
// then restore the flag to its original value. This flag is a
// debugging measure designed to detect cases where we start a
// snapshot, create type variables, register obligations involving
// those type variables in the fulfillment cx, and then have to
// unroll the snapshot, leaving "dangling type variables" behind.
// In such cases, the flag will be set by the fulfillment cx, and
// an assertion will fail when rolling the snapshot back. Very
// useful, much better than grovelling through megabytes of
// RUST_LOG output.
//
// HOWEVER, in some cases the flag is wrong. In particular, we
// sometimes create a "mini-fulfilment-cx" in which we enroll
// obligations. As long as this fulfillment cx is fully drained
// before we return, this is not a problem, as there won't be any
// escaping obligations in the main cx. In those cases, you can
// use this function.
pub fn save_and_restore_obligations_in_snapshot_flag<F, R>(&self, func: F) -> R
where F: FnOnce(&Self) -> R
{
let flag = self.obligations_in_snapshot.get();
self.obligations_in_snapshot.set(false);
let result = func(self);
self.obligations_in_snapshot.set(flag);
result
}

fn start_snapshot(&self) -> CombinedSnapshot {
debug!("start_snapshot()");

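
The helper added above is a plain save/clear/restore around a closure. A standalone sketch of that pattern, with a bare Cell<bool> standing in for the infcx flag (names here are illustrative, not rustc's):

use std::cell::Cell;

struct Ctx {
    obligations_in_snapshot: Cell<bool>,
}

impl Ctx {
    // Clear the flag, run the closure, then put the old value back,
    // mirroring save_and_restore_obligations_in_snapshot_flag.
    fn save_and_restore_flag<F, R>(&self, func: F) -> R
        where F: FnOnce(&Self) -> R
    {
        let flag = self.obligations_in_snapshot.get();
        self.obligations_in_snapshot.set(false);
        let result = func(self);
        self.obligations_in_snapshot.set(flag);
        result
    }
}

fn main() {
    let ctx = Ctx { obligations_in_snapshot: Cell::new(true) };
    let r = ctx.save_and_restore_flag(|c| {
        assert!(!c.obligations_in_snapshot.get()); // cleared inside the closure
        42
    });
    assert_eq!(r, 42);
    assert!(ctx.obligations_in_snapshot.get()); // original value restored
}
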
4 changes: 4 additions & 0 deletions src/librustc/middle/cstore.rs
@@ -39,6 +39,7 @@ use std::rc::Rc;
use std::path::PathBuf;
use syntax::ast;
use syntax::attr;
use syntax::ext::base::LoadedMacro;
use syntax::ptr::P;
use syntax::parse::token::InternedString;
use syntax_pos::Span;
@@ -492,6 +493,9 @@ impl<'tcx> CrateStore<'tcx> for DummyCrateStore {
fn metadata_encoding_version(&self) -> &[u8] { bug!("metadata_encoding_version") }
}

pub trait MacroLoader {
fn load_crate(&mut self, extern_crate: &ast::Item, allows_macros: bool) -> Vec<LoadedMacro>;
}

/// Metadata encoding and decoding can make use of thread-local encoding and
/// decoding contexts. These allow implementers of serialize::Encodable and
9 changes: 2 additions & 7 deletions src/librustc/session/mod.rs
@@ -21,7 +21,7 @@ use util::nodemap::{NodeMap, FnvHashMap};
use util::common::duration_to_secs_str;
use mir::transform as mir_pass;

use syntax::ast::{NodeId, Name};
use syntax::ast::NodeId;
use errors::{self, DiagnosticBuilder};
use errors::emitter::{Emitter, EmitterWriter};
use syntax::json::JsonEmitter;
@@ -39,7 +39,7 @@ use llvm;

use std::path::{Path, PathBuf};
use std::cell::{self, Cell, RefCell};
use std::collections::{HashMap, HashSet};
use std::collections::HashMap;
use std::env;
use std::ffi::CString;
use std::rc::Rc;
@@ -96,10 +96,6 @@ pub struct Session {
pub injected_allocator: Cell<Option<ast::CrateNum>>,
pub injected_panic_runtime: Cell<Option<ast::CrateNum>>,

/// Names of all bang-style macros and syntax extensions
/// available in this crate
pub available_macros: RefCell<HashSet<Name>>,

/// Map from imported macro spans (which consist of
/// the localized span for the macro body) to the
/// macro name and defintion span in the source crate.
@@ -552,7 +548,6 @@ pub fn build_session_(sopts: config::Options,
next_node_id: Cell::new(1),
injected_allocator: Cell::new(None),
injected_panic_runtime: Cell::new(None),
available_macros: RefCell::new(HashSet::new()),
imported_macro_spans: RefCell::new(HashMap::new()),
incr_comp_session: RefCell::new(IncrCompSession::NotInitialized),
perf_stats: PerfStats {
46 changes: 24 additions & 22 deletions src/librustc/traits/specialize/mod.rs
@@ -203,32 +203,34 @@ fn fulfill_implication<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
// attempt to prove all of the predicates for impl2 given those for impl1
// (which are packed up in penv)

let mut fulfill_cx = FulfillmentContext::new();
for oblig in obligations.into_iter() {
fulfill_cx.register_predicate_obligation(&infcx, oblig);
}
match fulfill_cx.select_all_or_error(infcx) {
Err(errors) => {
// no dice!
debug!("fulfill_implication: for impls on {:?} and {:?}, could not fulfill: {:?} given \
{:?}",
source_trait_ref,
target_trait_ref,
errors,
infcx.parameter_environment.caller_bounds);
Err(())
infcx.save_and_restore_obligations_in_snapshot_flag(|infcx| {
let mut fulfill_cx = FulfillmentContext::new();
for oblig in obligations.into_iter() {
fulfill_cx.register_predicate_obligation(&infcx, oblig);
}
match fulfill_cx.select_all_or_error(infcx) {
Err(errors) => {
// no dice!
debug!("fulfill_implication: for impls on {:?} and {:?}, \
could not fulfill: {:?} given {:?}",
source_trait_ref,
target_trait_ref,
errors,
infcx.parameter_environment.caller_bounds);
Err(())
}

Ok(()) => {
debug!("fulfill_implication: an impl for {:?} specializes {:?}",
source_trait_ref,
target_trait_ref);
Ok(()) => {
debug!("fulfill_implication: an impl for {:?} specializes {:?}",
source_trait_ref,
target_trait_ref);

// Now resolve the *substitution* we built for the target earlier, replacing
// the inference variables inside with whatever we got from fulfillment.
Ok(infcx.resolve_type_vars_if_possible(&target_substs))
// Now resolve the *substitution* we built for the target earlier, replacing
// the inference variables inside with whatever we got from fulfillment.
Ok(infcx.resolve_type_vars_if_possible(&target_substs))
}
}
}
})
}

pub struct SpecializesCache {
2 changes: 1 addition & 1 deletion src/librustc/ty/context.rs
@@ -1303,7 +1303,7 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
}

pub fn mk_trait(self, mut obj: TraitObject<'tcx>) -> Ty<'tcx> {
obj.projection_bounds.sort_by(|a, b| a.sort_key().cmp(&b.sort_key()));
obj.projection_bounds.sort_by_key(|b| b.sort_key(self));
self.mk_ty(TyTrait(box obj))
}

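
Besides switching to the new tcx-aware key, the call above replaces a hand-written comparator (sort_by) with sort_by_key; the two produce the same order whenever the comparator just compares keys. A toy illustration with plain tuples rather than rustc types:

fn main() {
    let mut a = vec![(3, "c"), (1, "a"), (2, "b")];
    let mut b = a.clone();

    // Hand-written comparator over the key...
    a.sort_by(|x, y| x.0.cmp(&y.0));
    // ...and the equivalent, shorter sort_by_key form.
    b.sort_by_key(|x| x.0);

    assert_eq!(a, b);
    assert_eq!(a, vec![(1, "a"), (2, "b"), (3, "c")]);
}
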
4 changes: 0 additions & 4 deletions src/librustc/ty/mod.rs
@@ -1018,10 +1018,6 @@ impl<'tcx> PolyProjectionPredicate<'tcx> {
pub fn item_name(&self) -> Name {
self.0.projection_ty.item_name // safe to skip the binder to access a name
}

pub fn sort_key(&self) -> (DefId, Name) {
self.0.projection_ty.sort_key()
}
}

pub trait ToPolyTraitRef<'tcx> {
21 changes: 12 additions & 9 deletions src/librustc/ty/sty.rs
@@ -23,7 +23,7 @@ use std::mem;
use std::ops;
use syntax::abi;
use syntax::ast::{self, Name};
use syntax::parse::token::keywords;
use syntax::parse::token::{keywords, InternedString};

use serialize::{Decodable, Decoder, Encodable, Encoder};

@@ -440,12 +440,6 @@ pub struct ProjectionTy<'tcx> {
pub item_name: Name,
}

impl<'tcx> ProjectionTy<'tcx> {
pub fn sort_key(&self) -> (DefId, Name) {
(self.trait_ref.def_id, self.item_name)
}
}

#[derive(Clone, PartialEq, Eq, Hash, Debug)]
pub struct BareFnTy<'tcx> {
pub unsafety: hir::Unsafety,
@@ -738,8 +732,17 @@ impl<'a, 'tcx, 'gcx> PolyExistentialProjection<'tcx> {
self.0.item_name // safe to skip the binder to access a name
}

pub fn sort_key(&self) -> (DefId, Name) {
(self.0.trait_ref.def_id, self.0.item_name)
pub fn sort_key(&self, tcx: TyCtxt<'a, 'gcx, 'tcx>) -> (u64, InternedString) {
// We want something here that is stable across crate boundaries.
// The DefId isn't but the `deterministic_hash` of the corresponding
// DefPath is.
let trait_def = tcx.lookup_trait_def(self.0.trait_ref.def_id);
let def_path_hash = trait_def.def_path_hash;

// An `ast::Name` is also not stable (it's just an index into an
// interning table), so map to the corresponding `InternedString`.
let item_name = self.0.item_name.as_str();
(def_path_hash, item_name)
}

pub fn with_self_ty(&self, tcx: TyCtxt<'a, 'gcx, 'tcx>,
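
The motivation spelled out in the new comments: a DefId or an interned Name is just an index into compiler-internal tables, so it can differ across crates and sessions, while a deterministic hash of the def-path plus the item name as a string does not. A toy illustration of why sorting by such a key is stable (DefaultHasher is only a stand-in for rustc's deterministic DefPath hash):

use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in for a deterministic def-path hash: the result depends only
// on the input bytes, not on any interning or numbering order.
fn path_hash(path: &str) -> u64 {
    let mut h = DefaultHasher::new();
    path.hash(&mut h);
    h.finish()
}

fn main() {
    let mut bounds = vec![
        ("std::ops::Deref", "Target"),
        ("std::iter::Iterator", "Item"),
    ];
    // Sorting by (path hash, item name) yields the same order regardless
    // of the order in which the entries were created or interned.
    bounds.sort_by_key(|&(path, name)| (path_hash(path), name));
    println!("{:?}", bounds);
}
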
10 changes: 8 additions & 2 deletions src/librustc/ty/trait_def.rs
@@ -70,15 +70,20 @@ pub struct TraitDef<'tcx> {
pub specialization_graph: RefCell<traits::specialization_graph::Graph>,

/// Various flags
pub flags: Cell<TraitFlags>
pub flags: Cell<TraitFlags>,

/// The ICH of this trait's DefPath, cached here so it doesn't have to be
/// recomputed all the time.
pub def_path_hash: u64,
}

impl<'a, 'gcx, 'tcx> TraitDef<'tcx> {
pub fn new(unsafety: hir::Unsafety,
paren_sugar: bool,
generics: &'tcx ty::Generics<'tcx>,
trait_ref: ty::TraitRef<'tcx>,
associated_type_names: Vec<Name>)
associated_type_names: Vec<Name>,
def_path_hash: u64)
-> TraitDef<'tcx> {
TraitDef {
paren_sugar: paren_sugar,
@@ -90,6 +95,7 @@ impl<'a, 'gcx, 'tcx> TraitDef<'tcx> {
blanket_impls: RefCell::new(vec![]),
flags: Cell::new(ty::TraitFlags::NO_TRAIT_FLAGS),
specialization_graph: RefCell::new(traits::specialization_graph::Graph::new()),
def_path_hash: def_path_hash,
}
}

39 changes: 5 additions & 34 deletions src/librustc/ty/util.rs
@@ -411,15 +411,11 @@ impl<'a, 'gcx, 'tcx> TypeIdHasher<'a, 'gcx, 'tcx> {
}

fn def_id(&mut self, did: DefId) {
// Hash the crate identification information.
let name = self.tcx.crate_name(did.krate);
let disambiguator = self.tcx.crate_disambiguator(did.krate);
self.hash((name, disambiguator));

// Hash the item path within that crate.
// FIXME(#35379) This should use a deterministic
// DefPath hashing mechanism, not the DefIndex.
self.hash(did.index);
// Hash the DefPath corresponding to the DefId, which is independent
// of compiler internal state.
let tcx = self.tcx;
let def_path = tcx.def_path(did);
def_path.deterministic_hash_to(tcx, &mut self.state);
}
}

@@ -445,33 +441,8 @@ impl<'a, 'gcx, 'tcx> TypeVisitor<'tcx> for TypeIdHasher<'a, 'gcx, 'tcx> {
self.hash(f.sig.variadic());
}
TyTrait(ref data) => {
// Trait objects have a list of projection bounds
// that are not guaranteed to be sorted in an order
// that gets preserved across crates, so we need
// to sort them again by the name, in string form.

// Hash the whole principal trait ref.
self.def_id(data.principal.def_id());
data.principal.visit_with(self);

// Hash region and builtin bounds.
data.region_bound.visit_with(self);
self.hash(data.builtin_bounds);

// Only projection bounds are left, sort and hash them.
let mut projection_bounds: Vec<_> = data.projection_bounds
.iter()
.map(|b| (b.item_name().as_str(), b))
.collect();
projection_bounds.sort_by_key(|&(ref name, _)| name.clone());
for (name, bound) in projection_bounds {
self.def_id(bound.0.trait_ref.def_id);
self.hash(name);
bound.visit_with(self);
}

// Bypass super_visit_with, we've visited everything.
return false;
}
TyTuple(tys) => {
self.hash(tys.len());