| T.R | Title | User | Personal Name | Date | Lines |
|---|---|---|---|---|---|
| 1265.1 |  | GUESS::DERAMO | Colorado Rocky Mountain high | Tue Jul 03 1990 10:18 | 9 | 
|  | >>    	3)  What are "upper envelopes of sets of line segments"?
	Perhaps the convex hull of the points of intersection
	and endpoints of the segments.  What I thought of when
	I saw the term was a parabola with, say, a dozen tangents
	drawn against it, making the outline of a polygon that
	"hugs" the parabola.
	Dan
 | 
| 1265.2 | iterated hulls... | ALLVAX::JROTH | It's a bush recording... | Fri Jul 06 1990 16:25 | 21 | 
|  |    I'm not sure of the exact "practical" use of onion peeling, though it is
   somewhat related to gift wrapping for determining convex hulls, run in
   reverse.  It is sometimes referred to as the "iterated hull".

   I believe the best known time is O(n log(n)^2) to get the onion of a set
   of points.

   Probably it is being studied because it sheds light on some fundamental
   complexity issues in computational geometry.
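
   Roughly, the onion is what you get by repeatedly taking the convex hull
   of the remaining points and stripping it off as a layer.  A minimal
   sketch of that naive layer-by-layer approach (the function names and
   sample points are illustrative only; each peel is an O(n log n) hull
   computation, so this sketch runs in about O(n^2 log n), not the faster
   bound mentioned above):

```python
# Naive "onion peeling": repeatedly take the convex hull of the remaining
# points and strip it off as the next layer.

def cross(o, a, b):
    # z-component of (a - o) x (b - o); > 0 means a counter-clockwise turn
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    # Andrew's monotone chain; returns hull vertices in counter-clockwise order
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def onion_layers(points):
    # Peel hulls until no points remain; the list of hulls is the "onion"
    remaining = list(points)
    layers = []
    while remaining:
        hull = convex_hull(remaining)
        layers.append(hull)
        hull_set = set(hull)
        remaining = [p for p in remaining if p not in hull_set]
    return layers

print(onion_layers([(0, 0), (4, 0), (4, 4), (0, 4), (2, 2), (1, 2), (2, 1)]))
```
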
   The upper envelope of line segments is just what one would think it is -
   kind of an upper convex hull.  This is of interest because of the
   popularity of plane-sweep algorithms.  If it could be determined very
   efficiently, then plane sweeping could be done faster.
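
   Concretely, at any x the upper envelope is simply the highest segment
   lying over that point.  A tiny brute-force sketch of that definition
   (the segment data and names are illustrative only; this scans all
   segments per query rather than building the envelope efficiently):

```python
# Brute-force "upper envelope" of line segments: at any x, the envelope is
# the highest y over all segments whose x-range covers that x.

def envelope_height(segments, x):
    # each segment is ((x1, y1), (x2, y2)) with x1 <= x2
    best = None
    for (x1, y1), (x2, y2) in segments:
        if x1 <= x <= x2:
            # height of the segment at x (vertical segments ignored for simplicity)
            t = 0.0 if x2 == x1 else (x - x1) / (x2 - x1)
            y = y1 + t * (y2 - y1)
            best = y if best is None else max(best, y)
    return best

segs = [((0, 0), (4, 2)), ((1, 3), (3, 1)), ((2, 0), (5, 4))]
print([envelope_height(segs, x) for x in (0.5, 1.5, 2.5, 4.5)])
```
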
   I assume the interest in parallelism comes from things like the
   Connection Machine.  A lot of practical problems such as VLSI layout and
   routing would be sped up quite a bit in practice if they were amenable
   to running on parallel processors.
   - Jim
 | 
| 1265.3 | Got some answers... | CHOVAX::YOUNG | Bush: 'Read my lips; I Lied' | Sat Jul 07 1990 12:04 | 14 | 
|  |     I talked to Lyle Ramshaw yesterday and got answers to most of my
    questions, especially what is meant by "Fast in Parallel".
    
    Apparently a problem may be considered "Fast" in parallel if it can be
    done in "Polylog" time with an arbitrarily large number of processors.
    
    "Polylog" means O(log(n)^k) where (k) may be any positive constant.
    
    Re .2:
    
    O(n log(n)^2) is pretty good.  Do you have any idea how this algorithm
    works?  The best that I could come up with was O(n^2 log(n)).
    
    --  Barry
 |