// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

//! # Compilation of match statements
//!
//! I will endeavor to explain the code as best I can. I have only a loose
//! understanding of some parts of it.
//!
//! ## Matching
//!
//! The basic state of the code is maintained in an array `m` of `Match`
//! objects. Each `Match` describes some list of patterns, all of which must
//! match against the current list of values. If those patterns match, then
//! the arm listed in the match is the correct arm. A given arm may have
//! multiple corresponding match entries, one for each alternative that
//! remains. As we proceed these sets of matches are adjusted by the various
//! `enter_XXX()` functions, each of which adjusts the set of options given
//! some information about the value which has been matched.
//!
//! So, initially, there is one value and N matches, each of which has one
//! constituent pattern. N here is usually the number of arms but may be
//! greater, if some arms have multiple alternatives. For example, here:
//!
//!     enum Foo { A, B(isize), C(usize, usize) }
//!     match foo {
//!         A => ...,
//!         B(x) => ...,
//!         C(1, 2) => ...,
//!         C(_) => ...
//!     }
//!
//! The value would be `foo`. There would be four matches, each of which
//! contains one pattern (and, in one case, a guard). We could collect the
//! various options and then compile the code for the case where `foo` is an
//! `A`, a `B`, and a `C`. When we generate the code for `C`, we would (1)
//! drop the two matches that do not match a `C` and (2) expand the other two
//! into two patterns each. In the first case, the two patterns would be `1`
//! and `2`, and in the second case the `_` pattern would be expanded into
//! `_` and `_`. The two values are of course the arguments to `C`.
//!
//! Here is a quick guide to the various functions:
//!
//! - `compile_submatch()`: The main workhorse. It takes a list of values and
//!   a list of matches and finds the various possibilities that could occur.
//!
//! - `enter_XXX()`: modifies the list of matches based on some information
//!   about the value that has been matched. For example,
//!   `enter_rec_or_struct()` adjusts the values given that a record or struct
//!   has been matched. This is an infallible pattern, so *all* of the matches
//!   must be either wildcards or record/struct patterns. `enter_opt()`
//!   handles the fallible cases, and it is correspondingly more complex
//!   (an informal sketch of this specialization step follows).
//!
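//! As an informal sketch (not the actual compiler data structures), the four
//! arms above start out as a pattern "matrix" with a single column:
//!
//!     [ A       ]   -> arm 0
//!     [ B(x)    ]   -> arm 1
//!     [ C(1, 2) ]   -> arm 2
//!     [ C(_)    ]   -> arm 3
//!
//! Entering the `C` constructor drops the first two rows (they cannot match a
//! `C`) and replaces the remaining column with one column per field of `C`:
//!
//!     [ 1, 2 ]      -> arm 2
//!     [ _, _ ]      -> arm 3
//!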
//! ## Bindings
//!
//! We store information about the bound variables for each arm as part of the
//! per-arm `ArmData` struct. There is a mapping from identifiers to
//! `BindingInfo` structs. These structs contain the mode/id/type of the
//! binding, but they also contain an LLVM value which points at an alloca
//! called `llmatch`. For by-value bindings that are Copy, we also create
//! an extra alloca that we copy the matched value to, so that any changes
//! we make to our copy are not reflected in the original and vice versa.
//! We don't do this if it's a move, since the original value can't be used
//! afterwards, which allows us to cheat by not creating an extra alloca.
//!
//! The `llmatch` binding always stores a pointer into the value being matched
//! which points at the data for the binding. If the value being matched has
//! type `T`, then `llmatch` will point at an alloca of type `T*` (and hence
//! `llmatch` has type `T**`). So, if you have a pattern like:
//!
//!     let a: A = ...;
//!     let b: B = ...;
//!     match (a, b) { (ref c, d) => { ... } }
//!
//! For `c` and `d`, we would generate allocas of type `C*` and `D*`
//! respectively. These are called the `llmatch`. As we match, when we come
//! up against an identifier, we store the current pointer into the
//! corresponding alloca.
//!
//! Once a pattern is completely matched, and assuming that there is no guard
//! pattern, we will branch to a block that leads to the body itself. For any
//! by-value bindings, this block will first load the ptr from `llmatch` (the
//! one of type `D*`) and then load a second time to get the actual value (the
//! one of type `D`). For by-ref bindings, the value of the local variable is
//! simply the first alloca.
//!
//! So, for the example above, we would generate a setup kind of like this:
//!
//!        +-------+
//!        | Entry |
//!        +-------+
//!            |
//!        +--------------------------------------------+
//!        | llmatch_c = (addr of first half of tuple)  |
//!        | llmatch_d = (addr of second half of tuple) |
//!        +--------------------------------------------+
//!            |
//!        +--------------------------------------+
//!        | *llbinding_d = **llmatch_d           |
//!        +--------------------------------------+
//!
//! If there is a guard, the situation is slightly different, because we must
//! execute the guard code. Moreover, we need to do so once for each of the
//! alternatives that lead to the arm, because if the guard fails, they may
//! have different points from which to continue the search. Therefore, in that
//! case, we generate code that looks more like:
//!
//!        +-------+
//!        | Entry |
//!        +-------+
//!            |
//!        +--------------------------------------------+
//!        | llmatch_c = (addr of first half of tuple)  |
//!        | llmatch_d = (addr of second half of tuple) |
//!        +--------------------------------------------+
//!            |
//!        +-------------------------------------------------+
//!        | *llbinding_d = **llmatch_d                      |
//!        | check condition                                 |
//!        | if false { goto next case }                     |
//!        | if true { goto body }                           |
//!        +-------------------------------------------------+
//!
//! The handling for the cleanups is a bit... sensitive. Basically, the body
//! is the one that invokes `add_clean()` for each binding. During the guard
//! evaluation, we add temporary cleanups and revoke them after the guard is
//! evaluated (it could fail, after all). Note that guards and moves are
//! just plain incompatible.
//!
//! Some relevant helper functions that manage bindings:
//! - `create_bindings_map()`
//! - `insert_lllocals()`
//!
//!
//! ## Notes on vector pattern matching
//!
//! Vector pattern matching is surprisingly tricky. The problem is that
//! the structure of the vector isn't fully known, and slice matches
//! can be done on subparts of it.
//!
//! The way that vector pattern matches are dealt with, then, is as
//! follows. First, we make the actual condition associated with a
//! vector pattern simply a vector length comparison. So the pattern
//! [1, .. x] gets the condition "vec len >= 1", and the pattern
//! [.. x] gets the condition "vec len >= 0". The problem here is that
//! having the condition "vec len >= 1" hold clearly does not mean that
//! only a pattern that has exactly that condition will match. This
//! means that it may well be the case that a condition holds, but none
//! of the patterns matching that condition match; to deal with this,
//! when doing vector length matches, we have match failures proceed to
//! the next condition to check.
//!
//! There are a couple more subtleties to deal with. While the "actual"
//! condition associated with vector length tests is simply a test on
//! the vector length, the actual vec_len Opt entry contains more
//! information used to restrict which matches are associated with it.
//! So that all matches in a submatch are matching against the same
//! values from inside the vector, they are split up by how many
//! elements they match at the front and at the back of the vector. In
//! order to make sure that arms are properly checked in order, even
//! with the overmatching conditions, each vec_len Opt entry is
//! associated with a range of matches.
//! Consider the following:
//!
//!     match &[1, 2, 3] {
//!         [1, 1, .. _] => 0,
//!         [1, 2, 2, .. _] => 1,
//!         [1, 2, 3, .. _] => 2,
//!         [1, 2, .. _] => 3,
//!         _ => 4
//!     }
//!
//! The proper arm to match is arm 2, but arms 0 and 3 both have the
//! condition "len >= 2". If arm 3 was lumped in with arm 0, then the
//! wrong branch would be taken. Instead, vec_len Opts are associated
//! with a contiguous range of matches that have the same "shape".
//! This is sort of ugly and requires a bunch of special handling of
//! vec_len options.
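//!
//! Concretely (an informal restatement of the example above), the vector arms
//! split by "shape", i.e. by (elements matched at the front, elements matched
//! at the back):
//!
//!     arm 0: [1, 1, .. _]     shape (2, 0), condition "len >= 2"
//!     arm 1: [1, 2, 2, .. _]  shape (3, 0), condition "len >= 3"
//!     arm 2: [1, 2, 3, .. _]  shape (3, 0), condition "len >= 3"
//!     arm 3: [1, 2, .. _]     shape (2, 0), condition "len >= 2"
//!
//! Arms 0 and 3 share a shape but are not contiguous, so they end up in two
//! separate vec_len Opt entries rather than being lumped together.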

pub use self::BranchKind::*;
pub use self::OptResult::*;
pub use self::TransBindingMode::*;
use self::Opt::*;
use self::FailureHandler::*;

use llvm::{ValueRef, BasicBlockRef};
use middle::check_match::StaticInliner;
use middle::check_match;
use middle::const_eval;
use middle::def::{Def, DefMap};
use middle::def_id::DefId;
use middle::expr_use_visitor as euv;
use middle::infer;
use middle::lang_items::StrEqFnLangItem;
use middle::mem_categorization as mc;
use middle::mem_categorization::Categorization;
use middle::pat_util::*;
use trans::adt;
use trans::base::*;
use trans::build::{AddCase, And, Br, CondBr, GEPi, InBoundsGEP, Load, PointerCast};
use trans::build::{Not, Store, Sub, add_comment};
use trans::build;
use trans::callee;
use trans::cleanup::{self, CleanupMethods, DropHintMethods};
use trans::common::*;
use trans::consts;
use trans::datum::*;
use trans::debuginfo::{self, DebugLoc, ToDebugLoc};
use trans::expr::{self, Dest};
use trans::monomorphize;
use trans::tvec;
use trans::type_of;
use trans::Disr;
use middle::ty::{self, Ty};
use session::config::NoDebugInfo;
use util::common::indenter;
use util::nodemap::FnvHashMap;
use util::ppaux;

use std;
use std::cell::RefCell;
use std::cmp::Ordering;
use std::fmt;
use std::rc::Rc;
use rustc_front::hir::{self, PatKind};
use syntax::ast::{self, DUMMY_NODE_ID, NodeId};
use syntax::codemap::Span;
use rustc_front::fold::Folder;
use syntax::ptr::P;

#[derive(Copy, Clone, Debug)]
struct ConstantExpr<'a>(&'a hir::Expr);

impl<'a> ConstantExpr<'a> {
    fn eq(self, other: ConstantExpr<'a>, tcx: &ty::ctxt) -> bool {
        match const_eval::compare_lit_exprs(tcx, self.0, other.0) {
            Some(result) => result == Ordering::Equal,
            None => panic!("compare_lit_exprs: type mismatch"),
        }
    }
}

// An option identifying a branch (either a literal, an enum variant or a range)
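// Informally: a literal pattern like `0` yields a `ConstantValue`, a range
// pattern like `0...9` yields a `ConstantRange`, an enum variant pattern
// yields a `Variant`, and vector patterns like `[a, b]` or `[a, ..]` yield
// `SliceLengthEqual(2)` or `SliceLengthGreaterOrEqual(1, 0)` respectively
// (see `get_branches` below).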
#[derive(Debug)]
enum Opt<'a, 'tcx> {
    ConstantValue(ConstantExpr<'a>, DebugLoc),
    ConstantRange(ConstantExpr<'a>, ConstantExpr<'a>, DebugLoc),
    Variant(Disr, Rc<adt::Repr<'tcx>>, DefId, DebugLoc),
    SliceLengthEqual(usize, DebugLoc),
    SliceLengthGreaterOrEqual(/* prefix length */ usize,
                              /* suffix length */ usize,
                              DebugLoc),
}

impl<'a, 'tcx> Opt<'a, 'tcx> {
    fn eq(&self, other: &Opt<'a, 'tcx>, tcx: &ty::ctxt<'tcx>) -> bool {
        match (self, other) {
            (&ConstantValue(a, _), &ConstantValue(b, _)) => a.eq(b, tcx),
            (&ConstantRange(a1, a2, _), &ConstantRange(b1, b2, _)) => {
                a1.eq(b1, tcx) && a2.eq(b2, tcx)
            }
            (&Variant(a_disr, ref a_repr, a_def, _),
             &Variant(b_disr, ref b_repr, b_def, _)) => {
                a_disr == b_disr && *a_repr == *b_repr && a_def == b_def
            }
            (&SliceLengthEqual(a, _), &SliceLengthEqual(b, _)) => a == b,
            (&SliceLengthGreaterOrEqual(a1, a2, _),
             &SliceLengthGreaterOrEqual(b1, b2, _)) => {
                a1 == b1 && a2 == b2
            }
            _ => false
        }
    }

    fn trans<'blk>(&self, mut bcx: Block<'blk, 'tcx>) -> OptResult<'blk, 'tcx> {
        use trans::consts::TrueConst::Yes;
        let _icx = push_ctxt("match::trans_opt");
        let ccx = bcx.ccx();
        match *self {
            ConstantValue(ConstantExpr(lit_expr), _) => {
                let lit_ty = bcx.tcx().node_id_to_type(lit_expr.id);
                let expr = consts::const_expr(ccx, &lit_expr, bcx.fcx.param_substs, None, Yes);
                let llval = match expr {
                    Ok((llval, _)) => llval,
                    Err(err) => bcx.ccx().sess().span_fatal(lit_expr.span, &err.description()),
                };
                let lit_datum = immediate_rvalue(llval, lit_ty);
                let lit_datum = unpack_datum!(bcx, lit_datum.to_appropriate_datum(bcx));
                SingleResult(Result::new(bcx, lit_datum.val))
            }
            ConstantRange(ConstantExpr(ref l1), ConstantExpr(ref l2), _) => {
                let l1 = match consts::const_expr(ccx, &l1, bcx.fcx.param_substs, None, Yes) {
                    Ok((l1, _)) => l1,
                    Err(err) => bcx.ccx().sess().span_fatal(l1.span, &err.description()),
                };
                let l2 = match consts::const_expr(ccx, &l2, bcx.fcx.param_substs, None, Yes) {
                    Ok((l2, _)) => l2,
                    Err(err) => bcx.ccx().sess().span_fatal(l2.span, &err.description()),
                };
                RangeResult(Result::new(bcx, l1), Result::new(bcx, l2))
            }
            Variant(disr_val, ref repr, _, _) => {
                SingleResult(Result::new(bcx, adt::trans_case(bcx, &repr, disr_val)))
            }
            SliceLengthEqual(length, _) => {
                SingleResult(Result::new(bcx, C_uint(ccx, length)))
            }
            SliceLengthGreaterOrEqual(prefix, suffix, _) => {
                LowerBound(Result::new(bcx, C_uint(ccx, prefix + suffix)))
            }
        }
    }

    fn debug_loc(&self) -> DebugLoc {
        match *self {
            ConstantValue(_, debug_loc) |
            ConstantRange(_, _, debug_loc) |
            Variant(_, _, _, debug_loc) |
            SliceLengthEqual(_, debug_loc) |
            SliceLengthGreaterOrEqual(_, _, debug_loc) => debug_loc
        }
    }
}

#[derive(Copy, Clone, PartialEq)]
pub enum BranchKind {
    NoBranch,
    Single,
    Switch,
    Compare,
    CompareSliceLength
}

pub enum OptResult<'blk, 'tcx: 'blk> {
    SingleResult(Result<'blk, 'tcx>),
    RangeResult(Result<'blk, 'tcx>, Result<'blk, 'tcx>),
    LowerBound(Result<'blk, 'tcx>)
}

#[derive(Clone, Copy, PartialEq)]
pub enum TransBindingMode {
    /// By-value binding for a copy type: copies from matched data
    /// into a fresh LLVM alloca.
    TrByCopy(/* llbinding */ ValueRef),

    /// By-value binding for a non-copy type where we copy into a
    /// fresh LLVM alloca; this most accurately reflects the language
    /// semantics (e.g. it properly handles overwrites of the matched
    /// input), but potentially injects an unwanted copy.
    TrByMoveIntoCopy(/* llbinding */ ValueRef),

    /// Binding a non-copy type by reference under the hood; this is
    /// a codegen optimization to avoid unnecessary memory traffic.
    TrByMoveRef,

    /// By-ref binding exposed in the original source input.
    TrByRef,
}
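
// Which mode a binding gets, roughly (see `create_bindings_map` below): a
// by-value binding of a `Copy` type gets `TrByCopy`; a by-value binding of a
// non-`Copy` type gets `TrByMoveIntoCopy` if the matched value is reassigned
// in the arm body, and `TrByMoveRef` otherwise; a `ref` binding gets `TrByRef`.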

impl TransBindingMode {
    /// If binding by making a fresh copy, returns the alloca that it
    /// will copy into; otherwise `None`.
    fn alloca_if_copy(&self) -> Option<ValueRef> {
        match *self {
            TrByCopy(llbinding) | TrByMoveIntoCopy(llbinding) => Some(llbinding),
            TrByMoveRef | TrByRef => None,
        }
    }
}

/// Information about a pattern binding:
/// - `llmatch` is a pointer to a stack slot. The stack slot contains a
///   pointer into the value being matched. Hence, llmatch has type `T**`
///   where `T` is the type of the value being matched.
/// - `trmode` is the trans binding mode
/// - `id` is the node id of the binding
/// - `ty` is the Rust type of the binding
#[derive(Clone, Copy)]
pub struct BindingInfo<'tcx> {
    pub llmatch: ValueRef,
    pub trmode: TransBindingMode,
    pub id: ast::NodeId,
    pub span: Span,
    pub ty: Ty<'tcx>,
}

type BindingsMap<'tcx> = FnvHashMap<ast::Name, BindingInfo<'tcx>>;

struct ArmData<'p, 'blk, 'tcx: 'blk> {
    bodycx: Block<'blk, 'tcx>,
    arm: &'p hir::Arm,
    bindings_map: BindingsMap<'tcx>
}

/// Info about Match.
/// If all `pats` are matched then arm `data` will be executed.
/// As we proceed, `bound_ptrs` is filled with pointers to the values to be
/// bound; these pointers are stored in the llmatch variables just before the
/// `data` arm is executed.
struct Match<'a, 'p: 'a, 'blk: 'a, 'tcx: 'blk> {
    pats: Vec<&'p hir::Pat>,
    data: &'a ArmData<'p, 'blk, 'tcx>,
    bound_ptrs: Vec<(ast::Name, ValueRef)>,
    // Thread along renamings done by the check_match::StaticInliner, so we can
    // map back to original NodeIds
    pat_renaming_map: Option<&'a FnvHashMap<(NodeId, Span), NodeId>>
}

impl<'a, 'p, 'blk, 'tcx> fmt::Debug for Match<'a, 'p, 'blk, 'tcx> {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        if ppaux::verbose() {
            // for many programs, this just takes too long to serialize
            write!(f, "{:?}", self.pats)
        } else {
            write!(f, "{} pats", self.pats.len())
        }
    }
}

fn has_nested_bindings(m: &[Match], col: usize) -> bool {
    for br in m {
        match br.pats[col].node {
            PatKind::Ident(_, _, Some(_)) => return true,
            _ => ()
        }
    }
    return false;
}

// As noted in `fn match_datum`, we should eventually pass around a
// `Datum<Lvalue>` for the `val`; but until we get to that point, this
// `MatchInput` struct will serve -- it has everything `Datum<Lvalue>`
// does except for the type field.
#[derive(Copy, Clone)]
pub struct MatchInput { val: ValueRef, lval: Lvalue }

impl<'tcx> Datum<'tcx, Lvalue> {
    pub fn match_input(&self) -> MatchInput {
        MatchInput {
            val: self.val,
            lval: self.kind,
        }
    }
}

impl MatchInput {
    fn from_val(val: ValueRef) -> MatchInput {
        MatchInput {
            val: val,
            lval: Lvalue::new("MatchInput::from_val"),
        }
    }

    fn to_datum<'tcx>(self, ty: Ty<'tcx>) -> Datum<'tcx, Lvalue> {
        Datum::new(self.val, ty, self.lval)
    }
}

fn expand_nested_bindings<'a, 'p, 'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                              m: &[Match<'a, 'p, 'blk, 'tcx>],
                                              col: usize,
                                              val: MatchInput)
                                              -> Vec<Match<'a, 'p, 'blk, 'tcx>> {
    debug!("expand_nested_bindings(bcx={}, m={:?}, col={}, val={})",
           bcx.to_str(),
           m,
           col,
           bcx.val_to_string(val.val));
    let _indenter = indenter();

    m.iter().map(|br| {
        let mut bound_ptrs = br.bound_ptrs.clone();
        let mut pat = br.pats[col];
        loop {
            pat = match pat.node {
                PatKind::Ident(_, ref path, Some(ref inner)) => {
                    bound_ptrs.push((path.node.name, val.val));
                    &inner
                },
                _ => break
            }
        }

        let mut pats = br.pats.clone();
        pats[col] = pat;
        Match {
            pats: pats,
            data: &br.data,
            bound_ptrs: bound_ptrs,
            pat_renaming_map: br.pat_renaming_map,
        }
    }).collect()
}

fn enter_match<'a, 'b, 'p, 'blk, 'tcx, F>(bcx: Block<'blk, 'tcx>,
                                          dm: &RefCell<DefMap>,
                                          m: &[Match<'a, 'p, 'blk, 'tcx>],
                                          col: usize,
                                          val: MatchInput,
                                          mut e: F)
                                          -> Vec<Match<'a, 'p, 'blk, 'tcx>> where
    F: FnMut(&[&'p hir::Pat]) -> Option<Vec<&'p hir::Pat>>,
{
    debug!("enter_match(bcx={}, m={:?}, col={}, val={})",
           bcx.to_str(),
           m,
           col,
           bcx.val_to_string(val.val));
    let _indenter = indenter();

    m.iter().filter_map(|br| {
        e(&br.pats).map(|pats| {
            let this = br.pats[col];
            let mut bound_ptrs = br.bound_ptrs.clone();
            match this.node {
                PatKind::Ident(_, ref path, None) => {
                    if pat_is_binding(&dm.borrow(), &this) {
                        bound_ptrs.push((path.node.name, val.val));
                    }
                }
                PatKind::Vec(ref before, Some(ref slice), ref after) => {
                    if let PatKind::Ident(_, ref path, None) = slice.node {
                        let subslice_val = bind_subslice_pat(
                            bcx, this.id, val,
                            before.len(), after.len());
                        bound_ptrs.push((path.node.name, subslice_val));
                    }
                }
                _ => {}
            }
            Match {
                pats: pats,
                data: br.data,
                bound_ptrs: bound_ptrs,
                pat_renaming_map: br.pat_renaming_map,
            }
        })
    }).collect()
}

fn enter_default<'a, 'p, 'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                     dm: &RefCell<DefMap>,
                                     m: &[Match<'a, 'p, 'blk, 'tcx>],
                                     col: usize,
                                     val: MatchInput)
                                     -> Vec<Match<'a, 'p, 'blk, 'tcx>> {
    debug!("enter_default(bcx={}, m={:?}, col={}, val={})",
           bcx.to_str(),
           m,
           col,
           bcx.val_to_string(val.val));
    let _indenter = indenter();

    // Collect all of the matches that can match against anything.
    enter_match(bcx, dm, m, col, val, |pats| {
        if pat_is_binding_or_wild(&dm.borrow(), &pats[col]) {
            let mut r = pats[..col].to_vec();
            r.extend_from_slice(&pats[col + 1..]);
            Some(r)
        } else {
            None
        }
    })
}

// <pcwalton> nmatsakis: what does enter_opt do?
// <pcwalton> in trans/match
// <pcwalton> trans/match.rs is like stumbling around in a dark cave
// <nmatsakis> pcwalton: the enter family of functions adjust the set of
//             patterns as needed
// <nmatsakis> yeah, at some point I kind of achieved some level of
//             understanding
// <nmatsakis> anyhow, they adjust the patterns given that something of that
//             kind has been found
// <nmatsakis> pcwalton: ok, right, so enter_XXX() adjusts the patterns, as I
//             said
// <nmatsakis> enter_match() kind of embodies the generic code
// <nmatsakis> it is provided with a function that tests each pattern to see
//             if it might possibly apply and so forth
// <nmatsakis> so, if you have a pattern like {a: _, b: _, _} and one like _
// <nmatsakis> then _ would be expanded to (_, _)
// <nmatsakis> one spot for each of the sub-patterns
// <nmatsakis> enter_opt() is one of the more complex; it covers the fallible
//             cases
// <nmatsakis> enter_rec_or_struct() or enter_tuple() are simpler, since they
//             are infallible patterns
// <nmatsakis> so all patterns must either be records (resp. tuples) or
//             wildcards

/// The above is now outdated in that enter_match() now takes a function that
/// takes the complete row of patterns rather than just the first one.
/// Also, most of the enter_() family functions have been unified with
/// the check_match specialization step.
fn enter_opt<'a, 'p, 'blk, 'tcx>(
             bcx: Block<'blk, 'tcx>,
             _: ast::NodeId,
             dm: &RefCell<DefMap>,
             m: &[Match<'a, 'p, 'blk, 'tcx>],
             opt: &Opt,
             col: usize,
             variant_size: usize,
             val: MatchInput)
             -> Vec<Match<'a, 'p, 'blk, 'tcx>> {
    debug!("enter_opt(bcx={}, m={:?}, opt={:?}, col={}, val={})",
           bcx.to_str(),
           m,
           *opt,
           col,
           bcx.val_to_string(val.val));
    let _indenter = indenter();

    let ctor = match opt {
        &ConstantValue(ConstantExpr(expr), _) => check_match::ConstantValue(
            const_eval::eval_const_expr(bcx.tcx(), &expr)
        ),
        &ConstantRange(ConstantExpr(lo), ConstantExpr(hi), _) => check_match::ConstantRange(
            const_eval::eval_const_expr(bcx.tcx(), &lo),
            const_eval::eval_const_expr(bcx.tcx(), &hi)
        ),
        &SliceLengthEqual(n, _) =>
            check_match::Slice(n),
        &SliceLengthGreaterOrEqual(before, after, _) =>
            check_match::SliceWithSubslice(before, after),
        &Variant(_, _, def_id, _) =>
            check_match::Constructor::Variant(def_id)
    };

    let param_env = bcx.tcx().empty_parameter_environment();
    let mcx = check_match::MatchCheckCtxt {
        tcx: bcx.tcx(),
        param_env: param_env,
    };
    enter_match(bcx, dm, m, col, val, |pats|
        check_match::specialize(&mcx, &pats[..], &ctor, col, variant_size)
    )
}

// Returns the options in one column of matches. An option is something that
// needs to be conditionally matched at runtime; for example, the discriminant
// on a set of enum variants or a literal.
fn get_branches<'a, 'p, 'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                    m: &[Match<'a, 'p, 'blk, 'tcx>],
                                    col: usize)
                                    -> Vec<Opt<'p, 'tcx>> {
    let tcx = bcx.tcx();

    let mut found: Vec<Opt> = vec![];
    for br in m {
        let cur = br.pats[col];
        let debug_loc = match br.pat_renaming_map {
            Some(pat_renaming_map) => {
                match pat_renaming_map.get(&(cur.id, cur.span)) {
                    Some(&id) => DebugLoc::At(id, cur.span),
                    None => DebugLoc::At(cur.id, cur.span),
                }
            }
            None => DebugLoc::None
        };

        let opt = match cur.node {
            PatKind::Lit(ref l) => {
                ConstantValue(ConstantExpr(&l), debug_loc)
            }
            PatKind::Ident(..) | PatKind::Path(..) |
            PatKind::TupleStruct(..) | PatKind::Struct(..) => {
                // This is either an enum variant or a variable binding.
                let opt_def = tcx.def_map.borrow().get(&cur.id).map(|d| d.full_def());
                match opt_def {
                    Some(Def::Variant(enum_id, var_id)) => {
                        let variant = tcx.lookup_adt_def(enum_id).variant_with_id(var_id);
                        Variant(Disr::from(variant.disr_val),
                                adt::represent_node(bcx, cur.id),
                                var_id,
                                debug_loc)
                    }
                    _ => continue
                }
            }
            PatKind::Range(ref l1, ref l2) => {
                ConstantRange(ConstantExpr(&l1), ConstantExpr(&l2), debug_loc)
            }
            PatKind::Vec(ref before, None, ref after) => {
                SliceLengthEqual(before.len() + after.len(), debug_loc)
            }
            PatKind::Vec(ref before, Some(_), ref after) => {
                SliceLengthGreaterOrEqual(before.len(), after.len(), debug_loc)
            }
            _ => continue
        };

        if !found.iter().any(|x| x.eq(&opt, tcx)) {
            found.push(opt);
        }
    }
    found
}

struct ExtractedBlock<'blk, 'tcx: 'blk> {
    vals: Vec<ValueRef>,
    bcx: Block<'blk, 'tcx>,
}

fn extract_variant_args<'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                    repr: &adt::Repr<'tcx>,
                                    disr_val: Disr,
                                    val: MatchInput)
                                    -> ExtractedBlock<'blk, 'tcx> {
    let _icx = push_ctxt("match::extract_variant_args");
    // Assume enums are always sized for now.
    let val = adt::MaybeSizedValue::sized(val.val);
    let args = (0..adt::num_args(repr, disr_val)).map(|i| {
        adt::trans_field_ptr(bcx, repr, val, disr_val, i)
    }).collect();

    ExtractedBlock { vals: args, bcx: bcx }
}

/// Helper for converting from the ValueRef that we pass around in the match code, which is always
/// an lvalue, into a Datum. Eventually we should just pass around a Datum and be done with it.
fn match_datum<'tcx>(val: MatchInput, left_ty: Ty<'tcx>) -> Datum<'tcx, Lvalue> {
    val.to_datum(left_ty)
}

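/// Binds the subslice part of a vector pattern (e.g. the `x` in `[1, .. x]`
/// from the module docs) by materializing a fresh `&[T]` fat pointer on the
/// stack: the data pointer is `base + offset_left` and the length is
/// `len - (offset_left + offset_right)`, where `base` and `len` come from the
/// vector being matched.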
fn bind_subslice_pat(bcx: Block,
                     pat_id: ast::NodeId,
                     val: MatchInput,
                     offset_left: usize,
                     offset_right: usize) -> ValueRef {
    let _icx = push_ctxt("match::bind_subslice_pat");
    let vec_ty = node_id_type(bcx, pat_id);
    let vec_ty_contents = match vec_ty.sty {
        ty::TyBox(ty) => ty,
        ty::TyRef(_, mt) | ty::TyRawPtr(mt) => mt.ty,
        _ => vec_ty
    };
    let unit_ty = vec_ty_contents.sequence_element_type(bcx.tcx());
    let vec_datum = match_datum(val, vec_ty);
    let (base, len) = vec_datum.get_vec_base_and_len(bcx);

    let slice_begin = InBoundsGEP(bcx, base, &[C_uint(bcx.ccx(), offset_left)]);
    let slice_len_offset = C_uint(bcx.ccx(), offset_left + offset_right);
    let slice_len = Sub(bcx, len, slice_len_offset, DebugLoc::None);
    let slice_ty = bcx.tcx().mk_imm_ref(bcx.tcx().mk_region(ty::ReStatic),
                                        bcx.tcx().mk_slice(unit_ty));
    let scratch = rvalue_scratch_datum(bcx, slice_ty, "");
    Store(bcx, slice_begin, expr::get_dataptr(bcx, scratch.val));
    Store(bcx, slice_len, expr::get_meta(bcx, scratch.val));
    scratch.val
}

fn extract_vec_elems<'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                 left_ty: Ty<'tcx>,
                                 before: usize,
                                 after: usize,
                                 val: MatchInput)
                                 -> ExtractedBlock<'blk, 'tcx> {
    let _icx = push_ctxt("match::extract_vec_elems");
    let vec_datum = match_datum(val, left_ty);
    let (base, len) = vec_datum.get_vec_base_and_len(bcx);
    let mut elems = vec![];
    elems.extend((0..before).map(|i| GEPi(bcx, base, &[i])));
    elems.extend((0..after).rev().map(|i| {
        InBoundsGEP(bcx, base, &[
            Sub(bcx, len, C_uint(bcx.ccx(), i + 1), DebugLoc::None)
        ])
    }));
    ExtractedBlock { vals: elems, bcx: bcx }
}

// Macro for deciding whether any of the remaining matches fit a given kind of
// pattern. Note that, because the macro is well-typed, either ALL of the
// matches should fit that sort of pattern or NONE (however, some of the
// matches may be wildcards like _ or identifiers).
macro_rules! any_pat {
    ($m:expr, $col:expr, $pattern:pat) => (
        ($m).iter().any(|br| {
            match br.pats[$col].node {
                $pattern => true,
                _ => false
            }
        })
    )
}

fn any_uniq_pat(m: &[Match], col: usize) -> bool {
    any_pat!(m, col, PatKind::Box(_))
}

fn any_region_pat(m: &[Match], col: usize) -> bool {
    any_pat!(m, col, PatKind::Ref(..))
}

fn any_irrefutable_adt_pat(tcx: &ty::ctxt, m: &[Match], col: usize) -> bool {
    m.iter().any(|br| {
        let pat = br.pats[col];
        match pat.node {
            PatKind::Tup(_) => true,
            PatKind::Struct(..) | PatKind::TupleStruct(..) |
            PatKind::Path(..) | PatKind::Ident(_, _, None) => {
                match tcx.def_map.borrow().get(&pat.id).unwrap().full_def() {
                    Def::Struct(..) | Def::TyAlias(..) => true,
                    _ => false,
                }
            }
            _ => false
        }
    })
}

/// What to do when the pattern match fails.
enum FailureHandler {
    Infallible,
    JumpToBasicBlock(BasicBlockRef),
    Unreachable
}

impl FailureHandler {
    fn is_fallible(&self) -> bool {
        match *self {
            Infallible => false,
            _ => true
        }
    }

    fn is_infallible(&self) -> bool {
        !self.is_fallible()
    }

    fn handle_fail(&self, bcx: Block) {
        match *self {
            Infallible =>
                panic!("attempted to panic in a non-panicking panic handler!"),
            JumpToBasicBlock(basic_block) =>
                Br(bcx, basic_block, DebugLoc::None),
            Unreachable =>
                build::Unreachable(bcx)
        }
    }
}

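/// Picks which pattern column to specialize on next. Roughly: a column scores
/// 1 for every refutable pattern it contains and 0 for wildcards and bindings;
/// a column whose total score is 0 is irrefutable and is taken first (it would
/// only be duplicated into every branch otherwise), and otherwise the column
/// with the highest score wins. Columns consisting entirely of wildcards are
/// skipped.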
fn pick_column_to_specialize(def_map: &RefCell<DefMap>, m: &[Match]) -> Option<usize> {
    fn pat_score(def_map: &RefCell<DefMap>, pat: &hir::Pat) -> usize {
        match pat.node {
            PatKind::Ident(_, _, Some(ref inner)) => pat_score(def_map, &inner),
            _ if pat_is_refutable(&def_map.borrow(), pat) => 1,
            _ => 0
        }
    }

    let column_score = |m: &[Match], col: usize| -> usize {
        let total_score = m.iter()
            .map(|row| row.pats[col])
            .map(|pat| pat_score(def_map, pat))
            .sum();

        // Irrefutable columns always go first, they'd only be duplicated in the branches.
        if total_score == 0 {
            std::usize::MAX
        } else {
            total_score
        }
    };

    let column_contains_any_nonwild_patterns = |&col: &usize| -> bool {
        m.iter().any(|row| match row.pats[col].node {
            PatKind::Wild => false,
            _ => true
        })
    };

    (0..m[0].pats.len())
        .filter(column_contains_any_nonwild_patterns)
        .map(|col| (col, column_score(m, col)))
        .max_by_key(|&(_, score)| score)
        .map(|(col, _)| col)
}

// Compiles a comparison between two things.
fn compare_values<'blk, 'tcx>(cx: Block<'blk, 'tcx>,
                              lhs: ValueRef,
                              rhs: ValueRef,
                              rhs_t: Ty<'tcx>,
                              debug_loc: DebugLoc)
                              -> Result<'blk, 'tcx> {
    fn compare_str<'blk, 'tcx>(cx: Block<'blk, 'tcx>,
                               lhs_data: ValueRef,
                               lhs_len: ValueRef,
                               rhs_data: ValueRef,
                               rhs_len: ValueRef,
                               rhs_t: Ty<'tcx>,
                               debug_loc: DebugLoc)
                               -> Result<'blk, 'tcx> {
        let did = langcall(cx,
                           None,
                           &format!("comparison of `{}`", rhs_t),
                           StrEqFnLangItem);
        callee::trans_lang_call(cx, did, &[lhs_data, lhs_len, rhs_data, rhs_len], None, debug_loc)
    }

    let _icx = push_ctxt("compare_values");
    if rhs_t.is_scalar() {
        let cmp = compare_scalar_types(cx, lhs, rhs, rhs_t, hir::BiEq, debug_loc);
        return Result::new(cx, cmp);
    }

    match rhs_t.sty {
        ty::TyRef(_, mt) => match mt.ty.sty {
            ty::TyStr => {
                let lhs_data = Load(cx, expr::get_dataptr(cx, lhs));
                let lhs_len = Load(cx, expr::get_meta(cx, lhs));
                let rhs_data = Load(cx, expr::get_dataptr(cx, rhs));
                let rhs_len = Load(cx, expr::get_meta(cx, rhs));
                compare_str(cx, lhs_data, lhs_len, rhs_data, rhs_len, rhs_t, debug_loc)
            }
            ty::TyArray(ty, _) | ty::TySlice(ty) => match ty.sty {
                ty::TyUint(ast::UintTy::U8) => {
                    // NOTE: cast &[u8] and &[u8; N] to &str and abuse the str_eq lang item,
                    // which calls memcmp().
                    let pat_len = val_ty(rhs).element_type().array_length();
                    let ty_str_slice = cx.tcx().mk_static_str();

                    let rhs_data = GEPi(cx, rhs, &[0, 0]);
                    let rhs_len = C_uint(cx.ccx(), pat_len);

                    let lhs_data;
                    let lhs_len;
                    if val_ty(lhs) == val_ty(rhs) {
                        // Both the discriminant and the pattern are thin pointers
                        lhs_data = GEPi(cx, lhs, &[0, 0]);
                        lhs_len = C_uint(cx.ccx(), pat_len);
                    } else {
                        // The discriminant is a fat pointer
                        let llty_str_slice = type_of::type_of(cx.ccx(), ty_str_slice).ptr_to();
                        let lhs_str = PointerCast(cx, lhs, llty_str_slice);
                        lhs_data = Load(cx, expr::get_dataptr(cx, lhs_str));
                        lhs_len = Load(cx, expr::get_meta(cx, lhs_str));
                    }

                    compare_str(cx, lhs_data, lhs_len, rhs_data, rhs_len, rhs_t, debug_loc)
                },
                _ => cx.sess().bug("only byte strings supported in compare_values"),
            },
            _ => cx.sess().bug("only string and byte strings supported in compare_values"),
        },
        _ => cx.sess().bug("only scalars, byte strings, and strings supported in compare_values"),
    }
}

/// For each binding in `data.bindings_map`, adds an appropriate entry into the `fcx.lllocals` map
fn insert_lllocals<'blk, 'tcx>(mut bcx: Block<'blk, 'tcx>,
                               bindings_map: &BindingsMap<'tcx>,
                               cs: Option<cleanup::ScopeId>)
                               -> Block<'blk, 'tcx> {
    for (&name, &binding_info) in bindings_map {
        let (llval, aliases_other_state) = match binding_info.trmode {
            // By-value binding that copies: load the pointer to the matched
            // value and copy it into our own alloca.
            TrByCopy(llbinding) |
            TrByMoveIntoCopy(llbinding) => {
                let llval = Load(bcx, binding_info.llmatch);
                let lvalue = match binding_info.trmode {
                    TrByCopy(..) =>
                        Lvalue::new("_match::insert_lllocals"),
                    TrByMoveIntoCopy(..) => {
                        // match_input moves from the input into a
                        // separate stack slot.
                        //
                        // E.g. consider moving the value `D(A)` out
                        // of the tuple `(D(A), D(B))` and into the
                        // local variable `x` via the pattern `(x,_)`,
                        // leaving the remainder of the tuple `(_,
                        // D(B))` still to be dropped in the future.
                        //
                        // Thus, here we must zero the place that we
                        // are moving *from*, because we do not yet
                        // track drop flags for a fragmented parent
                        // match input expression.
                        //
                        // Longer term we will be able to map the move
                        // into `(x, _)` up to the parent path that
                        // owns the whole tuple, and mark the
                        // corresponding stack-local drop-flag
                        // tracking the first component of the tuple.
                        let hint_kind = HintKind::ZeroAndMaintain;
                        Lvalue::new_with_hint("_match::insert_lllocals (match_input)",
                                              bcx, binding_info.id, hint_kind)
                    }
                    _ => unreachable!(),
                };
                let datum = Datum::new(llval, binding_info.ty, lvalue);
                call_lifetime_start(bcx, llbinding);
                bcx = datum.store_to(bcx, llbinding);
                if let Some(cs) = cs {
                    bcx.fcx.schedule_lifetime_end(cs, llbinding);
                }

                (llbinding, false)
            },

            // By-value move bindings: load from the ptr into the matched value
            TrByMoveRef => (Load(bcx, binding_info.llmatch), true),

            // By-ref binding: use the ptr into the matched value
            TrByRef => (binding_info.llmatch, true),
        };

        // A local that aliases some other state must be zeroed, since
        // the other state (e.g. some parent data that we matched
        // into) will still have its subcomponents (such as this
        // local) destructed at the end of the parent's scope. Longer
        // term, we will properly map such parents to the set of
        // unique drop flags for its fragments.
        let hint_kind = if aliases_other_state {
            HintKind::ZeroAndMaintain
        } else {
            HintKind::DontZeroJustUse
        };
        let lvalue = Lvalue::new_with_hint("_match::insert_lllocals (local)",
                                           bcx,
                                           binding_info.id,
                                           hint_kind);
        let datum = Datum::new(llval, binding_info.ty, lvalue);
        if let Some(cs) = cs {
            let opt_datum = lvalue.dropflag_hint(bcx);
            bcx.fcx.schedule_lifetime_end(cs, binding_info.llmatch);
            bcx.fcx.schedule_drop_and_fill_mem(cs, llval, binding_info.ty, opt_datum);
        }

        debug!("binding {} to {}", binding_info.id, bcx.val_to_string(llval));
        bcx.fcx.lllocals.borrow_mut().insert(binding_info.id, datum);
        debuginfo::create_match_binding_metadata(bcx, name, binding_info);
    }
    bcx
}

fn compile_guard<'a, 'p, 'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                     guard_expr: &hir::Expr,
                                     data: &ArmData<'p, 'blk, 'tcx>,
                                     m: &[Match<'a, 'p, 'blk, 'tcx>],
                                     vals: &[MatchInput],
                                     chk: &FailureHandler,
                                     has_genuine_default: bool)
                                     -> Block<'blk, 'tcx> {
    debug!("compile_guard(bcx={}, guard_expr={:?}, m={:?}, vals=[{}])",
           bcx.to_str(),
           guard_expr,
           m,
           vals.iter().map(|v| bcx.val_to_string(v.val)).collect::<Vec<_>>().join(", "));
    let _indenter = indenter();

    let mut bcx = insert_lllocals(bcx, &data.bindings_map, None);

    let val = unpack_datum!(bcx, expr::trans(bcx, guard_expr));
    let val = val.to_llbool(bcx);

    for (_, &binding_info) in &data.bindings_map {
        if let Some(llbinding) = binding_info.trmode.alloca_if_copy() {
            call_lifetime_end(bcx, llbinding)
        }
    }

    for (_, &binding_info) in &data.bindings_map {
        bcx.fcx.lllocals.borrow_mut().remove(&binding_info.id);
    }

    with_cond(bcx, Not(bcx, val, guard_expr.debug_loc()), |bcx| {
        for (_, &binding_info) in &data.bindings_map {
            call_lifetime_end(bcx, binding_info.llmatch);
        }
        match chk {
            // If the default arm is the only one left, move on to the next
            // condition explicitly rather than (possibly) falling back to
            // the default arm.
            &JumpToBasicBlock(_) if m.len() == 1 && has_genuine_default => {
                chk.handle_fail(bcx);
            }
            _ => {
                compile_submatch(bcx, m, vals, chk, has_genuine_default);
            }
        };
        bcx
    })
}

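/// The main recursive driver: given the remaining pattern rows `m` and the
/// values `vals` still to be tested against, either picks a column to
/// specialize on (see `pick_column_to_specialize`) and recurses, or, when no
/// testable column is left, stores the bound pointers into their `llmatch`
/// allocas, runs the guard if there is one, and branches to the arm body.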
fn compile_submatch<'a, 'p, 'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                                        m: &[Match<'a, 'p, 'blk, 'tcx>],
                                        vals: &[MatchInput],
                                        chk: &FailureHandler,
                                        has_genuine_default: bool) {
    debug!("compile_submatch(bcx={}, m={:?}, vals=[{}])",
           bcx.to_str(),
           m,
           vals.iter().map(|v| bcx.val_to_string(v.val)).collect::<Vec<_>>().join(", "));
    let _indenter = indenter();
    let _icx = push_ctxt("match::compile_submatch");
    let mut bcx = bcx;
    if m.is_empty() {
        if chk.is_fallible() {
            chk.handle_fail(bcx);
        }
        return;
    }

    let tcx = bcx.tcx();
    let def_map = &tcx.def_map;
    match pick_column_to_specialize(def_map, m) {
        Some(col) => {
            let val = vals[col];
            if has_nested_bindings(m, col) {
                let expanded = expand_nested_bindings(bcx, m, col, val);
                compile_submatch_continue(bcx,
                                          &expanded[..],
                                          vals,
                                          chk,
                                          col,
                                          val,
                                          has_genuine_default)
            } else {
                compile_submatch_continue(bcx, m, vals, chk, col, val, has_genuine_default)
            }
        }
        None => {
            let data = &m[0].data;
            for &(ref name, ref value_ptr) in &m[0].bound_ptrs {
                let binfo = *data.bindings_map.get(name).unwrap();
                call_lifetime_start(bcx, binfo.llmatch);
                if binfo.trmode == TrByRef && type_is_fat_ptr(bcx.tcx(), binfo.ty) {
                    expr::copy_fat_ptr(bcx, *value_ptr, binfo.llmatch);
                }
                else {
                    Store(bcx, *value_ptr, binfo.llmatch);
                }
            }
            match data.arm.guard {
                Some(ref guard_expr) => {
                    bcx = compile_guard(bcx,
                                        &guard_expr,
                                        m[0].data,
                                        &m[1..m.len()],
                                        vals,
                                        chk,
                                        has_genuine_default);
                }
                _ => ()
            }
            Br(bcx, data.bodycx.llbb, DebugLoc::None);
        }
    }
}

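/// Handles a single column pick: destructures irrefutable patterns (structs,
/// tuples, boxes, references) directly, or else collects the column's `Opt`s
/// (see `get_branches`), decides how to branch on them (an LLVM `switch`, a
/// chain of comparisons, or no branch at all), emits the test, and recurses
/// via `compile_submatch` for each option and for the default case.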
fn compile_submatch_continue<'a, 'p, 'blk, 'tcx>(mut bcx: Block<'blk, 'tcx>,
                                                 m: &[Match<'a, 'p, 'blk, 'tcx>],
                                                 vals: &[MatchInput],
                                                 chk: &FailureHandler,
                                                 col: usize,
                                                 val: MatchInput,
                                                 has_genuine_default: bool) {
    let fcx = bcx.fcx;
    let tcx = bcx.tcx();
    let dm = &tcx.def_map;

    let mut vals_left = vals[0..col].to_vec();
    vals_left.extend_from_slice(&vals[col + 1..]);
    let ccx = bcx.fcx.ccx;

    // Find a real id (we're adding placeholder wildcard patterns, but
    // each column is guaranteed to have at least one real pattern)
    let pat_id = m.iter().map(|br| br.pats[col].id)
                         .find(|&id| id != DUMMY_NODE_ID)
                         .unwrap_or(DUMMY_NODE_ID);

    let left_ty = if pat_id == DUMMY_NODE_ID {
        tcx.mk_nil()
    } else {
        node_id_type(bcx, pat_id)
    };

    let mcx = check_match::MatchCheckCtxt {
        tcx: bcx.tcx(),
        param_env: bcx.tcx().empty_parameter_environment(),
    };
    let adt_vals = if any_irrefutable_adt_pat(bcx.tcx(), m, col) {
        let repr = adt::represent_type(bcx.ccx(), left_ty);
        let arg_count = adt::num_args(&repr, Disr(0));
        let (arg_count, struct_val) = if type_is_sized(bcx.tcx(), left_ty) {
            (arg_count, val.val)
        } else {
            // For an unsized ADT (i.e. DST struct), we need to treat
            // the last field specially: instead of simply passing a
            // ValueRef pointing to that field, as with all the others,
            // we skip it and instead construct a 'fat ptr' below.
            (arg_count - 1, Load(bcx, expr::get_dataptr(bcx, val.val)))
        };
        let mut field_vals: Vec<ValueRef> = (0..arg_count).map(|ix|
            // By definition, these are all sized
            adt::trans_field_ptr(bcx, &repr, adt::MaybeSizedValue::sized(struct_val), Disr(0), ix)
        ).collect();

        match left_ty.sty {
            ty::TyStruct(def, substs) if !type_is_sized(bcx.tcx(), left_ty) => {
                // The last field is technically unsized but
                // since we can only ever match that field behind
                // a reference we construct a fat ptr here.
                let unsized_ty = def.struct_variant().fields.last().map(|field| {
                    monomorphize::field_ty(bcx.tcx(), substs, field)
                }).unwrap();
                let scratch = alloc_ty(bcx, unsized_ty, "__struct_field_fat_ptr");

                let meta = Load(bcx, expr::get_meta(bcx, val.val));
                let struct_val = adt::MaybeSizedValue::unsized_(struct_val, meta);

                let data = adt::trans_field_ptr(bcx, &repr, struct_val, Disr(0), arg_count);
                Store(bcx, data, expr::get_dataptr(bcx, scratch));
                Store(bcx, meta, expr::get_meta(bcx, scratch));
                field_vals.push(scratch);
            }
            _ => {}
        }
        Some(field_vals)
    } else if any_uniq_pat(m, col) || any_region_pat(m, col) {
        Some(vec!(Load(bcx, val.val)))
    } else {
        match left_ty.sty {
            ty::TyArray(_, n) => {
                let args = extract_vec_elems(bcx, left_ty, n, 0, val);
                Some(args.vals)
            }
            _ => None
        }
    };
    match adt_vals {
        Some(field_vals) => {
            let pats = enter_match(bcx, dm, m, col, val, |pats|
                check_match::specialize(&mcx, pats,
                                        &check_match::Single, col,
                                        field_vals.len())
            );
            let mut vals: Vec<_> = field_vals.into_iter()
                .map(|v| MatchInput::from_val(v))
                .collect();
            vals.extend_from_slice(&vals_left);
            compile_submatch(bcx, &pats, &vals, chk, has_genuine_default);
            return;
        }
        _ => ()
    }

    // Decide what kind of branch we need
    let opts = get_branches(bcx, m, col);
    debug!("options={:?}", opts);
    let mut kind = NoBranch;
    let mut test_val = val.val;
    debug!("test_val={}", bcx.val_to_string(test_val));
    if !opts.is_empty() {
        match opts[0] {
            ConstantValue(..) | ConstantRange(..) => {
                test_val = load_if_immediate(bcx, val.val, left_ty);
                kind = if left_ty.is_integral() {
                    Switch
                } else {
                    Compare
                };
            }
            Variant(_, ref repr, _, _) => {
                let (the_kind, val_opt) = adt::trans_switch(bcx, &repr,
                                                            val.val, true);
                kind = the_kind;
                if let Some(tval) = val_opt { test_val = tval; }
            }
            SliceLengthEqual(..) | SliceLengthGreaterOrEqual(..) => {
                let (_, len) = tvec::get_base_and_len(bcx, val.val, left_ty);
                test_val = len;
                kind = Switch;
            }
        }
    }
    for o in &opts {
        match *o {
            ConstantRange(..) => { kind = Compare; break },
            SliceLengthGreaterOrEqual(..) => { kind = CompareSliceLength; break },
            _ => ()
        }
    }
    let else_cx = match kind {
        NoBranch | Single => bcx,
        _ => bcx.fcx.new_temp_block("match_else")
    };
    let sw = if kind == Switch {
        build::Switch(bcx, test_val, else_cx.llbb, opts.len())
    } else {
        C_int(ccx, 0) // Placeholder for when not using a switch
    };

    let defaults = enter_default(else_cx, dm, m, col, val);
    let exhaustive = chk.is_infallible() && defaults.is_empty();
    let len = opts.len();

    if exhaustive && kind == Switch {
        build::Unreachable(else_cx);
    }

    // Compile subtrees for each option
    for (i, opt) in opts.iter().enumerate() {
        // In some cases of range and vector pattern matching, we need to
        // override the failure case so that instead of failing, it proceeds
        // to try more matching. branch_chk, then, is the proper failure case
        // for the current conditional branch.
        let mut branch_chk = None;
        let mut opt_cx = else_cx;
        let debug_loc = opt.debug_loc();

        if kind == Switch || !exhaustive || i + 1 < len {
            opt_cx = bcx.fcx.new_temp_block("match_case");
            match kind {
                Single => Br(bcx, opt_cx.llbb, debug_loc),
                Switch => {
                    match opt.trans(bcx) {
                        SingleResult(r) => {
                            AddCase(sw, r.val, opt_cx.llbb);
                            bcx = r.bcx;
                        }
                        _ => {
                            bcx.sess().bug(
                                "in compile_submatch, expected \
                                 opt.trans() to return a SingleResult")
                        }
                    }
                }
                Compare | CompareSliceLength => {
                    let t = if kind == Compare {
                        left_ty
                    } else {
                        tcx.types.usize // vector length
                    };
                    let Result { bcx: after_cx, val: matches } = {
                        match opt.trans(bcx) {
                            SingleResult(Result { bcx, val }) => {
                                compare_values(bcx, test_val, val, t, debug_loc)
                            }
                            RangeResult(Result { val: vbegin, .. },
                                        Result { bcx, val: vend }) => {
                                let llge = compare_scalar_types(bcx, test_val, vbegin,
                                                                t, hir::BiGe, debug_loc);
                                let llle = compare_scalar_types(bcx, test_val, vend,
                                                                t, hir::BiLe, debug_loc);
                                Result::new(bcx, And(bcx, llge, llle, DebugLoc::None))
                            }
                            LowerBound(Result { bcx, val }) => {
                                Result::new(bcx, compare_scalar_types(bcx, test_val,
                                                                      val, t, hir::BiGe,
                                                                      debug_loc))
                            }
                        }
                    };
                    bcx = fcx.new_temp_block("compare_next");

                    // If none of the sub-cases match, and the current condition
                    // is guarded or has multiple patterns, move on to the next
                    // condition, if there is any, rather than falling back to
                    // the default.
                    let guarded = m[i].data.arm.guard.is_some();
                    let multi_pats = m[i].pats.len() > 1;
                    if i + 1 < len && (guarded || multi_pats || kind == CompareSliceLength) {
                        branch_chk = Some(JumpToBasicBlock(bcx.llbb));
                    }
                    CondBr(after_cx, matches, opt_cx.llbb, bcx.llbb, debug_loc);
                }
                _ => ()
            }
        } else if kind == Compare || kind == CompareSliceLength {
            Br(bcx, else_cx.llbb, debug_loc);
        }

        let mut size = 0;
        let mut unpacked = Vec::new();
        match *opt {
            Variant(disr_val, ref repr, _, _) => {
                let ExtractedBlock {vals: argvals, bcx: new_bcx} =
                    extract_variant_args(opt_cx, &repr, disr_val, val);
                size = argvals.len();
                unpacked = argvals;
                opt_cx = new_bcx;
            }
            SliceLengthEqual(len, _) => {
                let args = extract_vec_elems(opt_cx, left_ty, len, 0, val);
                size = args.vals.len();
                unpacked = args.vals.clone();
                opt_cx = args.bcx;
            }
            SliceLengthGreaterOrEqual(before, after, _) => {
                let args = extract_vec_elems(opt_cx, left_ty, before, after, val);
                size = args.vals.len();
                unpacked = args.vals.clone();
                opt_cx = args.bcx;
            }
            ConstantValue(..) | ConstantRange(..) => ()
        }
        let opt_ms = enter_opt(opt_cx, pat_id, dm, m, opt, col, size, val);
        let mut opt_vals: Vec<_> = unpacked.into_iter()
            .map(|v| MatchInput::from_val(v))
            .collect();
        opt_vals.extend_from_slice(&vals_left[..]);
        compile_submatch(opt_cx,
                         &opt_ms[..],
                         &opt_vals[..],
                         branch_chk.as_ref().unwrap_or(chk),
                         has_genuine_default);
    }

    // Compile the fall-through case, if any
    if !exhaustive && kind != Single {
        if kind == Compare || kind == CompareSliceLength {
            Br(bcx, else_cx.llbb, DebugLoc::None);
        }
        match chk {
            // If there is only one default arm left, move on to the next
            // condition explicitly rather than (eventually) falling back to
            // the last default arm.
            &JumpToBasicBlock(_) if defaults.len() == 1 && has_genuine_default => {
                chk.handle_fail(else_cx);
            }
            _ => {
                compile_submatch(else_cx,
                                 &defaults[..],
                                 &vals_left[..],
                                 chk,
                                 has_genuine_default);
            }
        }
    }
}

pub fn trans_match<'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
                               match_expr: &hir::Expr,
                               discr_expr: &hir::Expr,
                               arms: &[hir::Arm],
                               dest: Dest)
                               -> Block<'blk, 'tcx> {
    let _icx = push_ctxt("match::trans_match");
    trans_match_inner(bcx, match_expr.id, discr_expr, arms, dest)
}

/// Checks whether the binding in `discr` is assigned to anywhere in the expression `body`
fn is_discr_reassigned(bcx: Block, discr: &hir::Expr, body: &hir::Expr) -> bool {
    let (vid, field) = match discr.node {
        hir::ExprPath(..) => match bcx.def(discr.id) {
            Def::Local(_, vid) | Def::Upvar(_, vid, _, _) => (vid, None),
            _ => return false
        },
        hir::ExprField(ref base, field) => {
            let vid = match bcx.tcx().def_map.borrow().get(&base.id).map(|d| d.full_def()) {
                Some(Def::Local(_, vid)) | Some(Def::Upvar(_, vid, _, _)) => vid,
                _ => return false
            };
            (vid, Some(mc::NamedField(field.node)))
        },
        hir::ExprTupField(ref base, field) => {
            let vid = match bcx.tcx().def_map.borrow().get(&base.id).map(|d| d.full_def()) {
                Some(Def::Local(_, vid)) | Some(Def::Upvar(_, vid, _, _)) => vid,
                _ => return false
            };
            (vid, Some(mc::PositionalField(field.node)))
        },
        _ => return false
    };

    let mut rc = ReassignmentChecker {
        node: vid,
        field: field,
        reassigned: false
    };
    {
        let infcx = infer::normalizing_infer_ctxt(bcx.tcx(), &bcx.tcx().tables);
        let mut visitor = euv::ExprUseVisitor::new(&mut rc, &infcx);
        visitor.walk_expr(body);
    }
    rc.reassigned
}

struct ReassignmentChecker {
    node: ast::NodeId,
    field: Option<mc::FieldName>,
    reassigned: bool
}

// Determine if the expression we're matching on is reassigned to within
// the body of the match's arm.
// We only care about the `mutate` callback since this check only matters
// for cases where the matched value is moved.
impl<'tcx> euv::Delegate<'tcx> for ReassignmentChecker {
    fn consume(&mut self, _: ast::NodeId, _: Span, _: mc::cmt, _: euv::ConsumeMode) {}
    fn matched_pat(&mut self, _: &hir::Pat, _: mc::cmt, _: euv::MatchMode) {}
    fn consume_pat(&mut self, _: &hir::Pat, _: mc::cmt, _: euv::ConsumeMode) {}
    fn borrow(&mut self, _: ast::NodeId, _: Span, _: mc::cmt, _: ty::Region,
              _: ty::BorrowKind, _: euv::LoanCause) {}
    fn decl_without_init(&mut self, _: ast::NodeId, _: Span) {}

    fn mutate(&mut self, _: ast::NodeId, _: Span, cmt: mc::cmt, _: euv::MutateMode) {
        match cmt.cat {
            Categorization::Upvar(mc::Upvar { id: ty::UpvarId { var_id: vid, .. }, .. }) |
            Categorization::Local(vid) => self.reassigned |= self.node == vid,
            Categorization::Interior(ref base_cmt, mc::InteriorField(field)) => {
                match base_cmt.cat {
                    Categorization::Upvar(mc::Upvar { id: ty::UpvarId { var_id: vid, .. }, .. }) |
                    Categorization::Local(vid) => {
                        self.reassigned |= self.node == vid &&
                            (self.field.is_none() || Some(field) == self.field)
                    },
                    _ => {}
                }
            },
            _ => {}
        }
    }
}
1521
e9174d1e
SL
1522fn create_bindings_map<'blk, 'tcx>(bcx: Block<'blk, 'tcx>, pat: &hir::Pat,
1523 discr: &hir::Expr, body: &hir::Expr)
1a4d82fc
JJ
1524 -> BindingsMap<'tcx> {
1525 // Create the bindings map, which is a mapping from each binding name
1526 // to an alloca() that will be the value for that local variable.
1527 // Note that we use the names because each binding will have many ids
1528 // from the various alternatives.
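    // For example (illustrative), in an arm `Ok(x) | Err(x) => ...` the two
    // `x` patterns carry distinct ids but share one name, and therefore one
    // entry (and one alloca) in the bindings map.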
1529 let ccx = bcx.ccx();
1530 let tcx = bcx.tcx();
1531 let reassigned = is_discr_reassigned(bcx, discr, body);
85aaf69f 1532 let mut bindings_map = FnvHashMap();
7453a54e 1533 pat_bindings(&tcx.def_map, &pat, |bm, p_id, span, path1| {
92a42be0 1534 let name = path1.node;
1a4d82fc
JJ
1535 let variable_ty = node_id_type(bcx, p_id);
1536 let llvariable_ty = type_of::type_of(ccx, variable_ty);
1537 let tcx = bcx.tcx();
c1a9b12d 1538 let param_env = tcx.empty_parameter_environment();
1a4d82fc
JJ
1539
1540 let llmatch;
1541 let trmode;
c1a9b12d 1542 let moves_by_default = variable_ty.moves_by_default(&param_env, span);
1a4d82fc 1543 match bm {
e9174d1e 1544 hir::BindByValue(_) if !moves_by_default || reassigned =>
1a4d82fc 1545 {
e9174d1e
SL
1546 llmatch = alloca(bcx, llvariable_ty.ptr_to(), "__llmatch");
1547 let llcopy = alloca(bcx, llvariable_ty, &bcx.name(name));
c1a9b12d
SL
1548 trmode = if moves_by_default {
1549 TrByMoveIntoCopy(llcopy)
1550 } else {
1551 TrByCopy(llcopy)
1552 };
1a4d82fc 1553 }
e9174d1e 1554 hir::BindByValue(_) => {
1a4d82fc
JJ
1555 // in this case, the final type of the variable will be T,
1556 // but during matching we need to store a *T as explained
1557 // above
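                // (Illustrative sketch: for `match v { x => ... }` where `v`
                // moves by value, `llmatch` holds a `*T` pointing at the
                // matched value while the match runs; the binding itself is
                // only materialized when the arm's bindings are installed.)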
e9174d1e 1558 llmatch = alloca(bcx, llvariable_ty.ptr_to(), &bcx.name(name));
c1a9b12d 1559 trmode = TrByMoveRef;
1a4d82fc 1560 }
e9174d1e
SL
1561 hir::BindByRef(_) => {
1562 llmatch = alloca(bcx, llvariable_ty, &bcx.name(name));
1a4d82fc
JJ
1563 trmode = TrByRef;
1564 }
1565 };
b039eaaf 1566 bindings_map.insert(name, BindingInfo {
1a4d82fc
JJ
1567 llmatch: llmatch,
1568 trmode: trmode,
1569 id: p_id,
1570 span: span,
1571 ty: variable_ty
1572 });
1573 });
1574 return bindings_map;
1575}
1576
1577fn trans_match_inner<'blk, 'tcx>(scope_cx: Block<'blk, 'tcx>,
1578 match_id: ast::NodeId,
e9174d1e
SL
1579 discr_expr: &hir::Expr,
1580 arms: &[hir::Arm],
1a4d82fc
JJ
1581 dest: Dest) -> Block<'blk, 'tcx> {
1582 let _icx = push_ctxt("match::trans_match_inner");
1583 let fcx = scope_cx.fcx;
1584 let mut bcx = scope_cx;
1585 let tcx = bcx.tcx();
1586
1587 let discr_datum = unpack_datum!(bcx, expr::trans_to_lvalue(bcx, discr_expr,
1588 "match"));
1589 if bcx.unreachable.get() {
1590 return bcx;
1591 }
1592
1593 let t = node_id_type(bcx, discr_expr.id);
c1a9b12d 1594 let chk = if t.is_empty(tcx) {
1a4d82fc
JJ
1595 Unreachable
1596 } else {
1597 Infallible
1598 };
1599
1600 let arm_datas: Vec<ArmData> = arms.iter().map(|arm| ArmData {
1601 bodycx: fcx.new_id_block("case_body", arm.body.id),
1602 arm: arm,
7453a54e 1603 bindings_map: create_bindings_map(bcx, &arm.pats[0], discr_expr, &arm.body)
1a4d82fc
JJ
1604 }).collect();
1605
85aaf69f
SL
1606 let mut pat_renaming_map = if scope_cx.sess().opts.debuginfo != NoDebugInfo {
1607 Some(FnvHashMap())
1608 } else {
1609 None
1610 };
1611
e9174d1e 1612 let arm_pats: Vec<Vec<P<hir::Pat>>> = {
85aaf69f
SL
1613 let mut static_inliner = StaticInliner::new(scope_cx.tcx(),
1614 pat_renaming_map.as_mut());
1615 arm_datas.iter().map(|arm_data| {
1616 arm_data.arm.pats.iter().map(|p| static_inliner.fold_pat((*p).clone())).collect()
1617 }).collect()
1618 };
1619
1a4d82fc 1620 let mut matches = Vec::new();
62682a34 1621 for (arm_data, pats) in arm_datas.iter().zip(&arm_pats) {
1a4d82fc 1622 matches.extend(pats.iter().map(|p| Match {
7453a54e 1623 pats: vec![&p],
1a4d82fc
JJ
1624 data: arm_data,
1625 bound_ptrs: Vec::new(),
85aaf69f 1626 pat_renaming_map: pat_renaming_map.as_ref()
1a4d82fc
JJ
1627 }));
1628 }
1629
1630    // `compile_submatch` works on one column of arm patterns at a time and
1631    // then peels that column off. So as we progress, it may become
1632    // impossible to tell whether we have a genuine default arm, i.e.
1633    // `_ => foo`, or not. Sometimes it is important to know this in order
1634    // to decide whether to move on to the next condition or to fall back
1635    // to the default arm.
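    // For example (illustrative), a final arm `_ => ...` is a genuine
    // default, whereas a final arm `(_, _) => ...` covers everything but is
    // not a bare `PatKind::Wild`, so `has_default` stays false for it.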
1636 let has_default = arms.last().map_or(false, |arm| {
1637 arm.pats.len() == 1
7453a54e 1638 && arm.pats.last().unwrap().node == PatKind::Wild
1a4d82fc
JJ
1639 });
1640
c1a9b12d 1641 compile_submatch(bcx, &matches[..], &[discr_datum.match_input()], &chk, has_default);
1a4d82fc
JJ
1642
1643 let mut arm_cxs = Vec::new();
85aaf69f 1644 for arm_data in &arm_datas {
1a4d82fc
JJ
1645 let mut bcx = arm_data.bodycx;
1646
1647 // insert bindings into the lllocals map and add cleanups
1648 let cs = fcx.push_custom_cleanup_scope();
1649 bcx = insert_lllocals(bcx, &arm_data.bindings_map, Some(cleanup::CustomScope(cs)));
7453a54e 1650 bcx = expr::trans_into(bcx, &arm_data.arm.body, dest);
1a4d82fc
JJ
1651 bcx = fcx.pop_and_trans_custom_cleanup_scope(bcx, cs);
1652 arm_cxs.push(bcx);
1653 }
1654
85aaf69f 1655 bcx = scope_cx.fcx.join_blocks(match_id, &arm_cxs[..]);
1a4d82fc
JJ
1656 return bcx;
1657}
1658
1659/// Generates code for a local variable declaration like `let <pat>;` or `let <pat> =
1660/// <opt_init_expr>`.
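///
/// For example (illustrative), `let x = make_vec();` takes the optimized
/// single-binding path below, while `let (a, b) = make_pair();` falls back
/// to the general `bind_irrefutable_pat` path (`make_vec`/`make_pair` are
/// hypothetical helpers).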
1661pub fn store_local<'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
e9174d1e 1662 local: &hir::Local)
1a4d82fc
JJ
1663 -> Block<'blk, 'tcx> {
1664 let _icx = push_ctxt("match::store_local");
1665 let mut bcx = bcx;
1666 let tcx = bcx.tcx();
7453a54e 1667 let pat = &local.pat;
1a4d82fc
JJ
1668
1669 fn create_dummy_locals<'blk, 'tcx>(mut bcx: Block<'blk, 'tcx>,
e9174d1e 1670 pat: &hir::Pat)
1a4d82fc 1671 -> Block<'blk, 'tcx> {
c34b1796 1672 let _icx = push_ctxt("create_dummy_locals");
1a4d82fc
JJ
1673 // create dummy memory for the variables if we have no
1674 // value to store into them immediately
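        // (Illustrative: a declaration such as `let x: Vec<u8>;` with no
        // initializer lands here; `x` gets its alloca now and is only
        // written when it is assigned later.)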
1675 let tcx = bcx.tcx();
1676 pat_bindings(&tcx.def_map, pat, |_, p_id, _, path1| {
1677 let scope = cleanup::var_scope(tcx, p_id);
1678 bcx = mk_binding_alloca(
b039eaaf 1679 bcx, p_id, path1.node, scope, (),
c1a9b12d
SL
1680 "_match::store_local::create_dummy_locals",
1681 |(), bcx, Datum { val: llval, ty, kind }| {
1682 // Dummy-locals start out uninitialized, so set their
1683 // drop-flag hints (if any) to "moved."
1684 if let Some(hint) = kind.dropflag_hint(bcx) {
e9174d1e 1685 let moved_hint = adt::DTOR_MOVED_HINT;
c1a9b12d
SL
1686 debug!("store moved_hint={} for hint={:?}, uninitialized dummy",
1687 moved_hint, hint);
1688 Store(bcx, C_u8(bcx.fcx.ccx, moved_hint), hint.to_value().value());
1689 }
1690
1691 if kind.drop_flag_info.must_zero() {
1692                // If there is no drop-flag hint, or the hint requires us
1693                // to maintain the embedded drop-flag, then mark the
1694                // embedded drop-flag(s) as moved
1695                // (i.e. "already dropped").
1696 drop_done_fill_mem(bcx, llval, ty);
1697 }
1698 bcx
1699 });
1a4d82fc
JJ
1700 });
1701 bcx
1702 }
1703
1704 match local.init {
1705 Some(ref init_expr) => {
1706 // Optimize the "let x = expr" case. This just writes
1707 // the result of evaluating `expr` directly into the alloca
1708 // for `x`. Often the general path results in similar or the
1709 // same code post-optimization, but not always. In particular,
1710 // in unsafe code, you can have expressions like
1711 //
1712 // let x = intrinsics::uninit();
1713 //
1714 // In such cases, the more general path is unsafe, because
1715 // it assumes it is matching against a valid value.
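            // (Note, illustrative: only patterns for which `simple_name`
            // returns a name -- i.e. plain identifier bindings -- take this
            // fast path; anything like `let (a, b) = ...` uses the general
            // path further down.)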
b039eaaf
SL
1716 match simple_name(pat) {
1717 Some(name) => {
1a4d82fc
JJ
1718 let var_scope = cleanup::var_scope(tcx, local.id);
1719 return mk_binding_alloca(
b039eaaf 1720 bcx, pat.id, name, var_scope, (),
c1a9b12d 1721 "_match::store_local",
7453a54e 1722 |(), bcx, Datum { val: v, .. }| expr::trans_into(bcx, &init_expr,
c1a9b12d 1723 expr::SaveIn(v)));
1a4d82fc
JJ
1724 }
1725
1726 None => {}
1727 }
1728
1729 // General path.
1730 let init_datum =
7453a54e 1731 unpack_datum!(bcx, expr::trans_to_lvalue(bcx, &init_expr, "let"));
1a4d82fc
JJ
1732 if bcx.sess().asm_comments() {
1733 add_comment(bcx, "creating zeroable ref llval");
1734 }
1735 let var_scope = cleanup::var_scope(tcx, local.id);
c1a9b12d 1736 bind_irrefutable_pat(bcx, pat, init_datum.match_input(), var_scope)
1a4d82fc
JJ
1737 }
1738 None => {
1739 create_dummy_locals(bcx, pat)
1740 }
1741 }
1742}
1743
1a4d82fc
JJ
1744fn mk_binding_alloca<'blk, 'tcx, A, F>(bcx: Block<'blk, 'tcx>,
1745 p_id: ast::NodeId,
9346a6ac 1746 name: ast::Name,
1a4d82fc
JJ
1747 cleanup_scope: cleanup::ScopeId,
1748 arg: A,
c1a9b12d 1749 caller_name: &'static str,
1a4d82fc
JJ
1750 populate: F)
1751 -> Block<'blk, 'tcx> where
c1a9b12d 1752 F: FnOnce(A, Block<'blk, 'tcx>, Datum<'tcx, Lvalue>) -> Block<'blk, 'tcx>,
1a4d82fc
JJ
1753{
1754 let var_ty = node_id_type(bcx, p_id);
1755
1756 // Allocate memory on stack for the binding.
9346a6ac 1757 let llval = alloc_ty(bcx, var_ty, &bcx.name(name));
c1a9b12d
SL
1758 let lvalue = Lvalue::new_with_hint(caller_name, bcx, p_id, HintKind::DontZeroJustUse);
1759 let datum = Datum::new(llval, var_ty, lvalue);
1a4d82fc 1760
9cc50fc6
SL
1761 debug!("mk_binding_alloca cleanup_scope={:?} llval={} var_ty={:?}",
1762 cleanup_scope, bcx.ccx().tn().val_to_string(llval), var_ty);
1763
1a4d82fc
JJ
1764 // Subtle: be sure that we *populate* the memory *before*
1765 // we schedule the cleanup.
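    // (Presumably: if the drop were scheduled first and `populate` then
    // unwound, the cleanup would run over uninitialized memory.)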
e9174d1e 1766 call_lifetime_start(bcx, llval);
c1a9b12d 1767 let bcx = populate(arg, bcx, datum);
1a4d82fc 1768 bcx.fcx.schedule_lifetime_end(cleanup_scope, llval);
c1a9b12d 1769 bcx.fcx.schedule_drop_mem(cleanup_scope, llval, var_ty, lvalue.dropflag_hint(bcx));
1a4d82fc
JJ
1770
1771 // Now that memory is initialized and has cleanup scheduled,
c1a9b12d 1772 // insert datum into the local variable map.
1a4d82fc
JJ
1773 bcx.fcx.lllocals.borrow_mut().insert(p_id, datum);
1774 bcx
1775}
1776
1777/// A simple version of the pattern matching code that only handles
1778/// irrefutable patterns. This is used for let/argument patterns,
1779/// not for match statements. Unifying this code with the code above
1780/// sounds nice, but in practice it produces very inefficient code,
1781/// since the match code is so much more general. In most cases LLVM
1782/// can optimize the result, but the general path still causes longer
1783/// compile times and makes the generated code nigh impossible to read.
1784///
1785/// # Arguments
1786/// - bcx: starting basic block context
1787/// - pat: the irrefutable pattern being matched.
1788/// - val: the value being matched -- must be an lvalue (by ref, with cleanup)
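///
/// Illustrative examples: `let (a, b) = pair;` and a function argument like
/// `fn f((x, y): (u32, u32))` both reach this code with an irrefutable
/// pattern, whereas refutable patterns such as `Some(v)` only appear in
/// `match` expressions and go through the machinery above.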
c1a9b12d 1789pub fn bind_irrefutable_pat<'blk, 'tcx>(bcx: Block<'blk, 'tcx>,
e9174d1e 1790 pat: &hir::Pat,
c1a9b12d 1791 val: MatchInput,
1a4d82fc
JJ
1792 cleanup_scope: cleanup::ScopeId)
1793 -> Block<'blk, 'tcx> {
92a42be0 1794 debug!("bind_irrefutable_pat(bcx={}, pat={:?}, val={})",
1a4d82fc 1795 bcx.to_str(),
92a42be0
SL
1796 pat,
1797 bcx.val_to_string(val.val));
1a4d82fc
JJ
1798
1799 if bcx.sess().asm_comments() {
62682a34
SL
1800 add_comment(bcx, &format!("bind_irrefutable_pat(pat={:?})",
1801 pat));
1a4d82fc
JJ
1802 }
1803
1804 let _indenter = indenter();
1805
1806 let _icx = push_ctxt("match::bind_irrefutable_pat");
1807 let mut bcx = bcx;
1808 let tcx = bcx.tcx();
1809 let ccx = bcx.ccx();
1810 match pat.node {
7453a54e
SL
1811 PatKind::Ident(pat_binding_mode, ref path1, ref inner) => {
1812 if pat_is_binding(&tcx.def_map.borrow(), &pat) {
1a4d82fc
JJ
1813 // Allocate the stack slot where the value of this
1814 // binding will live and place it into the appropriate
1815 // map.
1816 bcx = mk_binding_alloca(
9346a6ac 1817 bcx, pat.id, path1.node.name, cleanup_scope, (),
c1a9b12d
SL
1818 "_match::bind_irrefutable_pat",
1819 |(), bcx, Datum { val: llval, ty, kind: _ }| {
1a4d82fc 1820 match pat_binding_mode {
e9174d1e 1821 hir::BindByValue(_) => {
1a4d82fc
JJ
1822 // By value binding: move the value that `val`
1823 // points at into the binding's stack slot.
c1a9b12d 1824 let d = val.to_datum(ty);
1a4d82fc
JJ
1825 d.store_to(bcx, llval)
1826 }
1827
e9174d1e 1828 hir::BindByRef(_) => {
1a4d82fc 1829 // By ref binding: the value of the variable
c34b1796
AL
1830                        // is the pointer `val` itself or the fat pointer referenced by `val`.
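                        // (Illustrative: for a `ref x` binding the slot for
                        // `x` ends up holding `val` itself, i.e. a pointer
                        // into the matched value, not a copy of the data.)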
1831 if type_is_fat_ptr(bcx.tcx(), ty) {
c1a9b12d 1832 expr::copy_fat_ptr(bcx, val.val, llval);
c34b1796
AL
1833 }
1834 else {
c1a9b12d 1835 Store(bcx, val.val, llval);
c34b1796
AL
1836 }
1837
1a4d82fc
JJ
1838 bcx
1839 }
1840 }
1841 });
1842 }
1843
85aaf69f 1844 if let Some(ref inner_pat) = *inner {
7453a54e 1845 bcx = bind_irrefutable_pat(bcx, &inner_pat, val, cleanup_scope);
1a4d82fc
JJ
1846 }
1847 }
7453a54e 1848 PatKind::TupleStruct(_, ref sub_pats) => {
c34b1796 1849 let opt_def = bcx.tcx().def_map.borrow().get(&pat.id).map(|d| d.full_def());
1a4d82fc 1850 match opt_def {
7453a54e 1851 Some(Def::Variant(enum_id, var_id)) => {
1a4d82fc 1852 let repr = adt::represent_node(bcx, pat.id);
e9174d1e 1853 let vinfo = ccx.tcx().lookup_adt_def(enum_id).variant_with_id(var_id);
1a4d82fc 1854 let args = extract_variant_args(bcx,
7453a54e 1855 &repr,
9cc50fc6 1856 Disr::from(vinfo.disr_val),
1a4d82fc 1857 val);
85aaf69f 1858 if let Some(ref sub_pat) = *sub_pats {
1a4d82fc 1859 for (i, &argval) in args.vals.iter().enumerate() {
c1a9b12d
SL
1860 bcx = bind_irrefutable_pat(
1861 bcx,
7453a54e 1862 &sub_pat[i],
c1a9b12d
SL
1863 MatchInput::from_val(argval),
1864 cleanup_scope);
1a4d82fc
JJ
1865 }
1866 }
1867 }
7453a54e 1868 Some(Def::Struct(..)) => {
1a4d82fc
JJ
1869 match *sub_pats {
1870 None => {
1871 // This is a unit-like struct. Nothing to do here.
1872 }
1873 Some(ref elems) => {
1874 // This is the tuple struct case.
1875 let repr = adt::represent_node(bcx, pat.id);
92a42be0 1876 let val = adt::MaybeSizedValue::sized(val.val);
1a4d82fc 1877 for (i, elem) in elems.iter().enumerate() {
7453a54e 1878 let fldptr = adt::trans_field_ptr(bcx, &repr,
9cc50fc6 1879 val, Disr(0), i);
c1a9b12d
SL
1880 bcx = bind_irrefutable_pat(
1881 bcx,
7453a54e 1882 &elem,
c1a9b12d
SL
1883 MatchInput::from_val(fldptr),
1884 cleanup_scope);
1a4d82fc
JJ
1885 }
1886 }
1887 }
1888 }
1889 _ => {
1890 // Nothing to do here.
1891 }
1892 }
1893 }
7453a54e 1894 PatKind::Struct(_, ref fields, _) => {
1a4d82fc
JJ
1895 let tcx = bcx.tcx();
1896 let pat_ty = node_id_type(bcx, pat.id);
1897 let pat_repr = adt::represent_type(bcx.ccx(), pat_ty);
e9174d1e 1898 let pat_v = VariantInfo::of_node(tcx, pat_ty, pat.id);
92a42be0
SL
1899
1900 let val = if type_is_sized(tcx, pat_ty) {
1901 adt::MaybeSizedValue::sized(val.val)
1902 } else {
1903 let data = Load(bcx, expr::get_dataptr(bcx, val.val));
1904 let meta = Load(bcx, expr::get_meta(bcx, val.val));
1905 adt::MaybeSizedValue::unsized_(data, meta)
1906 };
1907
e9174d1e 1908 for f in fields {
b039eaaf 1909 let name = f.node.name;
92a42be0
SL
1910 let field_idx = pat_v.field_index(name);
1911 let mut fldptr = adt::trans_field_ptr(
e9174d1e 1912 bcx,
7453a54e 1913 &pat_repr,
92a42be0 1914 val,
e9174d1e 1915 pat_v.discr,
92a42be0
SL
1916 field_idx);
1917
1918 let fty = pat_v.fields[field_idx].1;
1919 // If it's not sized, then construct a fat pointer instead of
1920 // a regular one
1921 if !type_is_sized(tcx, fty) {
1922 let scratch = alloc_ty(bcx, fty, "__struct_field_fat_ptr");
1923 debug!("Creating fat pointer {}", bcx.val_to_string(scratch));
1924 Store(bcx, fldptr, expr::get_dataptr(bcx, scratch));
1925 Store(bcx, val.meta, expr::get_meta(bcx, scratch));
1926 fldptr = scratch;
1927 }
e9174d1e 1928 bcx = bind_irrefutable_pat(bcx,
7453a54e 1929 &f.node.pat,
e9174d1e
SL
1930 MatchInput::from_val(fldptr),
1931 cleanup_scope);
1932 }
1a4d82fc 1933 }
7453a54e 1934 PatKind::Tup(ref elems) => {
1a4d82fc 1935 let repr = adt::represent_node(bcx, pat.id);
92a42be0 1936 let val = adt::MaybeSizedValue::sized(val.val);
1a4d82fc 1937 for (i, elem) in elems.iter().enumerate() {
7453a54e 1938 let fldptr = adt::trans_field_ptr(bcx, &repr, val, Disr(0), i);
c1a9b12d
SL
1939 bcx = bind_irrefutable_pat(
1940 bcx,
7453a54e 1941 &elem,
c1a9b12d
SL
1942 MatchInput::from_val(fldptr),
1943 cleanup_scope);
1a4d82fc
JJ
1944 }
1945 }
7453a54e 1946 PatKind::Box(ref inner) => {
92a42be0 1947 let pat_ty = node_id_type(bcx, inner.id);
9cc50fc6
SL
1948 // Pass along DSTs as fat pointers.
1949 let val = if type_is_fat_ptr(tcx, pat_ty) {
1950 // We need to check for this, as the pattern could be binding
1951 // a fat pointer by-value.
7453a54e 1952 if let PatKind::Ident(hir::BindByRef(_),_,_) = inner.node {
9cc50fc6
SL
1953 val.val
1954 } else {
1955 Load(bcx, val.val)
1956 }
1957 } else if type_is_sized(tcx, pat_ty) {
92a42be0
SL
1958 Load(bcx, val.val)
1959 } else {
1960 val.val
1961 };
c1a9b12d 1962 bcx = bind_irrefutable_pat(
7453a54e 1963 bcx, &inner, MatchInput::from_val(val), cleanup_scope);
1a4d82fc 1964 }
7453a54e 1965 PatKind::Ref(ref inner, _) => {
92a42be0 1966 let pat_ty = node_id_type(bcx, inner.id);
9cc50fc6
SL
1967 // Pass along DSTs as fat pointers.
1968 let val = if type_is_fat_ptr(tcx, pat_ty) {
1969 // We need to check for this, as the pattern could be binding
1970 // a fat pointer by-value.
7453a54e 1971 if let PatKind::Ident(hir::BindByRef(_),_,_) = inner.node {
9cc50fc6
SL
1972 val.val
1973 } else {
1974 Load(bcx, val.val)
1975 }
1976 } else if type_is_sized(tcx, pat_ty) {
92a42be0
SL
1977 Load(bcx, val.val)
1978 } else {
1979 val.val
1980 };
c1a9b12d
SL
1981 bcx = bind_irrefutable_pat(
1982 bcx,
7453a54e 1983 &inner,
92a42be0 1984 MatchInput::from_val(val),
c1a9b12d 1985 cleanup_scope);
1a4d82fc 1986 }
7453a54e 1987 PatKind::Vec(ref before, ref slice, ref after) => {
1a4d82fc
JJ
1988 let pat_ty = node_id_type(bcx, pat.id);
1989 let mut extracted = extract_vec_elems(bcx, pat_ty, before.len(), after.len(), val);
1990 match slice {
1991 &Some(_) => {
1992 extracted.vals.insert(
1993 before.len(),
1994 bind_subslice_pat(bcx, pat.id, val, before.len(), after.len())
1995 );
1996 }
1997 &None => ()
1998 }
1999 bcx = before
2000 .iter()
2001 .chain(slice.iter())
2002 .chain(after.iter())
62682a34 2003 .zip(extracted.vals)
c1a9b12d
SL
2004 .fold(bcx, |bcx, (inner, elem)| {
2005 bind_irrefutable_pat(
2006 bcx,
7453a54e 2007 &inner,
c1a9b12d
SL
2008 MatchInput::from_val(elem),
2009 cleanup_scope)
2010 });
1a4d82fc 2011 }
7453a54e
SL
2012 PatKind::Path(..) | PatKind::QPath(..) | PatKind::Wild | PatKind::Lit(_) |
2013 PatKind::Range(_, _) => ()
1a4d82fc
JJ
2014 }
2015 return bcx;
2016}