Freesteel » Machining

Thursday, February 28th, 2013 at 7:01 pm - Julian - Machining

My overwhelming feat of insignificance

Work that is not productive, such as duplicating someone else’s work, subtracts from work that is productive. This matters when you are getting old and tired and no longer able to catch up on mistakes by programming all hours of the night.

I feel it is necessary to spend excessive time working out what needs to be done due to the dire shortcomings of the internal comms system. It’s easy to know where you are in a small company, because there are so few of you and the coverage is predictable. Duplication of other work (done by people in another company) takes place consciously and strategically. But in a large organization you can find 20 people smarter and younger than you each day, and there is no way that one of them hasn’t already programmed what you are about to waste the next two weeks programming. What is it again? Have I got a triangle colliding with a cone that is being swung on the end of a stick? I can easily enter a state of paralysis.

I continually look for clues everywhere. Last night I listened to the 4th quarter earnings report where the CEO read out his deadly dull prepared statements, and then got quizzed by the owners of the company.


Sunday, February 10th, 2013 at 4:54 pm - Julian - Machining

Forces of Production

A tip-off from a Noam Chomsky interview led me to the book Forces of Production: A Social History of Industrial Automation by David F. Noble. The core of the book is a history of N/C (Numerical Control) machining from its development at M.I.T. in the 1950s and its almost total funding by the US government through the Department of Defense for the next couple of decades, in spite of the fact that it was economically impractical because (a) the electronics were unreliable and complicated, and (b) the computers for calculating the toolpaths were too expensive. It did, however, have the advantage that it promised to do away with skilled machinists, who were able to bargain for better wages and so forth. Other more practical technologies, such as R/P (Record Playback of real motions onto a magnetic tape, the same way industrial robots have often been programmed), were dismissed, defunded and suppressed by various measures. For example, the use of the totally over-engineered N/C programming language APT was made a precondition for government contracts. (Just to be really annoying, the specification of APT was only available to AIA members.) At least the Defense Department has form; they did it again when they standardized on the equally over-engineered programming language Ada. (I had never seen the point of APT, having only ever worked with G-code.)

Anyhow, the thesis of the book is arguably far-fetched and Marxist (though extremely well researched). But then you look up the citation on p. 219 for the following exhibit: a 1963 United States Air Force promotional film on numerical control entitled Modern Manufacturing: A Command Performance.

Be sure to watch all of it, including the part about the primitive techniques of manual machine tool operation illustrated by a black man in a grass hut.

N/C machining equals nuclear missiles equals Cold War, which justified unlimited government dollars flowing to top universities such as MIT to do their interesting advanced research, which gave their staff, the intellectual elite of the nation, no reason ever to question the insane dance of death known as the Cold War. The sad thing about the Cold War is that it wasn’t even necessary: the public still funds unbelievable levels of spending on the same useless projects more than two decades after any viable pretext ceased to exist. Who could have known? Maybe it’s also possible to obtain sufficient public funding for technological development other than through the military budget. I bet there’s a way. And these university grant-scrabbling geniuses should be able to work it out if they weren’t so intellectually lazy and self-centred.

Monday, January 14th, 2013 at 6:56 pm - Julian - Machining

Can I Autodesk the house?

Carl Bass (CEO of Autodesk) sent a global email across the company announcing that he had made it easy for employees to install and use any Autodesk product on their computer.

Really? I thought. It’s a little late for a tech company to wake up to the Eat your own dog food principle. Better late than never. One’s got to participate and not leave everything to the boss.

Quote from an interview with Bloomberg.com in 2010:

By using the software himself, [Bass] is also trying to anticipate customers’ complaints.

“You install it, and say, ‘Why did it take me 40 minutes to do that? Why did it ask me 72 questions?’” Bass said.

…Bass is trying to enhance the software, which costs thousands of dollars, to give it a slicker look and more intuitive feel.

“It always seemed a shame to me that we might sell a $5,000 piece of software that doesn’t look as good as a $49 video game,” Bass said.

So I installed said £5000 piece of software, called Autodesk Revit Architecture 2013, and decided that a good project would be to make an architectural model of my sandstone house in Liverpool.

But how am I going to learn to use this software? I don’t know architecture, so I don’t know any of the unwritten conventions that are part of the process. I kept getting crappy pop-up dialog boxes telling me there was an error, to which all I could do was click [ok]. There are slicker ways to signal it. Though I expect if you were an architect who had just paid £5k for this software, you wouldn’t be making these sorts of mistakes. There’s no such thing as a beginner’s instruction manual for operating an electron microscope, is there?

I thought: Wouldn’t it be nice if I could find a dozen or so other employees across the company also taking part in this dogfooding program with the same idea, and we could all be motivated to press on with modelling our own houses and showing our progress, like a bunch of folks in a novel writing group?

There seemed to be no means of communication for soliciting such participants, so I tried a reply-to-sender on the email.

And got an email back from Mr Bass within minutes.

This is a slightly better response rate than I have had so far to my questions on the internal Ask CEO Staff website. Those tend to languish for many weeks before getting a wholly inadequate reply which, for reasons best known to themselves, the CEO Staff are too chicken to allow to be posted. I mean, what kinds of questions do they expect people to ask?

Anyways, having bothered the boss, I might as well try to carry this idea on.

After playing with the software a bit, it was clear I needed a footprint profile of the house.

I got Becka outside with her DistoX technology (electronic compass, clino and laser measuring device) that we use to survey caves, and did a circuit of the building.

Here is the data from the unit (interpreted by TunnelX):

*begin household

;;; TRIP COMMENT FROM POCKETTOPO ;;;

*date 2013.01.05

*data normal from to tape compass clino ignoreall

;from    to    tape(m) compass  clino
1-0	1-1	1.295	159.5	-2.8
1-1	1-2	2.159	70.4	0.4
1-2	1-3	3.094	161.5	1.3
1-3	1-4	5.497	56.3	10.8
1-4	1-5	3.382	338.7	-23.4
1-5	1-6	2.210	70.4	8.7
1-6	1-7	3.923	340.8	-0.1
1-7	1-8	2.293	268.7	7.2
1-8	1-9	2.452	321.8  -7.1
1-9	1-10	5.397	257.9	-0.1 
1-10	1-11	3.109	158.1	0.1
1-11	1-12	2.179	234.2	-0.1
1-12	1-13	2.642	152.6	-0.3 

*end household

[plot of the raw survey loop as drawn by TunnelX]

We did it at about shoulder height, except in the southeast corner where we needed to get the laser point over the fence, hence the extra 1m in height.

Not very accurate, is it?

That’s because there are too many large metal structures associated with the building that were buggering up the compass readings. Clearly, this survey technology is only appropriate in a cave where there is no metal. On a building site, they probably use GPS and/or triangulation.

To salvage this survey, I snapped each compass reading to the closest of 70, 160, 250 or 340 degrees.
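
The snapping step is nothing fancy. In Python it would go something like this (a throwaway sketch, not anything TunnelX actually does):

  WALL_BEARINGS = [70, 160, 250, 340]

  def snap_bearing(compass):
      # angular difference that wraps correctly around 360 degrees
      def gap(a, b):
          d = abs(a - b) % 360
          return min(d, 360 - d)
      return min(WALL_BEARINGS, key=lambda w: gap(compass, w))

  # snap_bearing(159.5) -> 160, snap_bearing(338.7) -> 340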

This is what it came out like:

1-0	1-1	1.295	160	-2.8
1-1	1-2	2.159	70	0.4
1-2	1-3	3.094	160	1.3
1-3	1-4	5.497	70	10.8
1-4	1-5	3.382	340	-23.4
1-5	1-6	2.210	70	8.7
1-6	1-7	3.923	340	-0.1

; put break here because non orthogonal leg
;1-7	1-8	2.293	268.7	7.2
1-7	1-8a	2.293	268.7	7.2

1-8	1-9	2.452	340	-7.1 
1-9	1-10	5.397	250	-0.1 
1-10	1-11	3.109	160	0.1
1-11	1-12	2.179	250	-0.1
1-12	1-13	2.642	160	-0.3 

*equate 1-13 1-0  ; loop close

[plot of the corrected survey loop]

Almost perfect. The northeast internal corner that isn’t square was where we were avoiding a downpipe and water butt.

Architecture software seems to rely on the stuff being at right angles, so a rotation of 20 degrees is required to get it ready. But that’s it.

This does suggest we could make it work by using the compass to determine only the cardinal direction, so it only needs to be good to the nearest 40 degrees. Then we could blip blip blip around all the interior walls of the house — including ceilings and floors, because the clino is still dead good — and produce an effective plan pretty quickly!

Then I could plug everything I know into their ecotect analysis something-something software to find out what I need to do, and actually participate in their technology trend #1 from their labs blog:

Reality capture (laser scanning, photogrammetry)

They are pleased with their two-and-a-half-minute video of a talking head. “This is two and a half minutes well spent,” adds the CTO. All very well, but how about some examples?

At the moment, all I’m doing is hammering holes in the kitchen floor to find out what’s underneath.

Answer: It’s not at all what I expected.

[photos of the hole in the kitchen floor]

Sand and rocks go down as deep as I can reach beneath parallel steel bars supporting the floor. Maybe there is a whole basement down there. You never know with these ancient houses.

Going to Cambridge tomorrow. Maybe there are some folks in the office there who are up for this and can teach me how to Revit.

Thursday, January 10th, 2013 at 6:30 pm - Julian - Adaptive

The toolpath collision point with an engagement sweep

Have just checked in 400 lines of the function CheckPathSegmentCollisionSweepArc() to go with the 150 lines I had done in CheckPathSegmentCollisionSweepLinesegment(). The line counts are a bit unfair because there are several sets of duplicated code for the clockwise and anticlockwise, in and out configurations.

This is to do with the new feature of toolpath re-ordering in the Adaptive Clearing algorithm, the culmination of the ideas I worked out with Martin on the train back from Copenhagen in October.

One of the fundamental routines requires considering a length of cutting toolpath P and comparing it with a length of cutting toolpath Q and its adjacent linking toolpaths from later on in the total toolpath, and deciding whether it is safe to do Q before P instead of after it.

Basically, it’s safe to reorder so long as the circular profile of the tool as it sweeps along the Q path does not hit any of the material removed by P.

Our first approximation of the function simply returned True when the bounding boxes of Q and P were separated by more than the sum of the two tool radii, and was good enough to demonstrate that the rest of the re-ordering feature was going to work. (I’ll outline that deceptively simple algorithm at a later date.)
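
For the record, that first approximation amounts to nothing more than this (a sketch with made-up names, not the checked-in code):

  def boxes_safely_separated(bbp, bbq, toolradp, toolradq):
      # bb = (xlo, ylo, xhi, yhi); True means the tool swept along Q
      # cannot possibly touch the material removed by P
      clearance = toolradp + toolradq
      return (bbq[0] > bbp[2] + clearance or bbq[2] < bbp[0] - clearance or
              bbq[1] > bbp[3] + clearance or bbq[3] < bbp[1] - clearance)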

But the results get tighter the better we approximate the material removed by the path P.

Our first attempt used the measured engagement value set at every node of the path P as a byproduct of the Adaptive Clearing algorithm.

The plots of it were beautiful, with the sequence of engagement arcs looking exactly like the scores on the metal, but I failed to take a screenshot of them. The problem was determining how frequently to sample along the toolpath. I don’t like fudging matters by plucking a small number out of the air that would probably work, but which would leave an unnecessary upper bound on performance. And I couldn’t guarantee that the engagement width between the nodes of the toolpath would never spike up to just below the maximum engagement angle and then drop back down to a small value before the next node.

[figure: the collision geometry of paths P and Q]

So I decided to assume the value was always the maximum engagement angle along the toolpath — which is already sampled at the widest rate possible specifically to not exceed this engagement angle. That makes it easy. It is an over-estimate, but it is controlled. Most of the time the engagement angle is close to the optimum. And when you want to favour speed of calculation over accuracy you leave a wider margin between the maximum and the optimum. And so it is reasonable for the approximation of the material removal area to degrade correspondingly (though always erring on the side of caution). When you tighten things up, everything should tighten. When they loosen, all parts should loosen.

The geometry is explained in the picture above. We are looking for the point along path Q where the tool profile collides with the material removed by path P. When Q is a cutting path, we just need to know Yes or No. When it’s No we cannot reorder Q before P. When Q is a linking path, we need to trim it back so that we can use the safe part of it to guide the toolpath away from the stock and calculate a new linking move that avoids all previously uncut stock.

Obviously we break Q down into line segments and arcs, and P down into components we call fsengagementsegsweep: each is composed of the red arc (with its two end points) and the two parallel orange lines, whose end points are embedded within the arcs and so do not need to be tested. There are 5 different ways for the circle profile to be in contact with an fsengagementsegsweep.

[figure: the five contact configurations between the circle profile and an fsengagementsegsweep]

The function CheckPathSegmentCollisionSweepLinesegment() dealt with the linear motion of one of those purple disks along a line segment of Q, finding the soonest point at which it makes contact in one of the five ways with the fsengagementsegsweeps generated by path P.
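
At its core each of those five contact cases is a moving circle against a fixed feature. The simplest one, against an arc end point, reduces to a quadratic. Sketched in Python (invented names, only one of the five cases):

  import math

  def first_contact_with_point(c0, c1, x, R):
      # disk of radius R moves from centre c0 to c1; find the smallest
      # t in [0,1] with |c(t) - x| = R, where c(t) = c0 + t*(c1 - c0)
      dx, dy = c1[0] - c0[0], c1[1] - c0[1]
      wx, wy = c0[0] - x[0], c0[1] - x[1]
      a = dx*dx + dy*dy
      b = 2*(wx*dx + wy*dy)
      c = wx*wx + wy*wy - R*R
      disc = b*b - 4*a*c
      if a == 0 or disc < 0:
          return None                      # no contact on this segment
      t = (-b - math.sqrt(disc)) / (2*a)   # the earlier root is first touch
      return t if 0 <= t <= 1 else None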

But then what to do with the arcs on the path Q (usually quite big ones when they involve linking motions)?

The first version was just to approximate the arcs by short line segments and call CheckPathSegmentCollisionSweepLinesegment() for each one of them. But that’s horrible. What tolerance do you use? There’s no right answer, so whatever you choose will forever stand between you and optimality.

Given that it is a soluble problem — circles on circular trajectories making contact with other circles — I’ve done it properly and given the analytic answer. But it’s been two days of deeply tedious work where I could be distracted by anything.
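
The underlying calculation is nothing worse than circle-circle intersection: the disk centre now travels on a circle, and contact with a point feature happens where that circle crosses another. Roughly, in Python (a sketch, not the checked-in code):

  import math

  def arc_contact_angles(o, A, x, R):
      # the disk centre travels on a circle of radius A about o; contact
      # with the point x happens where that circle meets the circle of
      # radius R about x; return the angles of the crossing points
      ex, ey = x[0] - o[0], x[1] - o[1]
      d = math.hypot(ex, ey)
      if d == 0 or d > A + R or d < abs(A - R):
          return []                        # the circles never cross
      base = math.atan2(ey, ex)
      # cosine rule on the triangle (o, x, crossing point)
      spread = math.acos((A*A + d*d - R*R) / (2*A*d))
      return [base - spread, base + spread]

The tedious part is deciding which crossing comes soonest along the arc in its direction of travel, clockwise or anticlockwise, and grinding through the same question for all five contact configurations.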

Here are some of my trial experiments. The yellow line is the cutting path P with its red and orange material removal areas. The black lines are repeated instances of the path Q — specifically an arc segment — that I am testing the function against. I have tried them both in the clockwise and anti-clockwise directions and plotted in green the point and disk of contact with the material.

[figure: trial collision tests of arc segments against the material removal areas]

It’s Thursday evening, and now it’s done at last. I’ve got all of Friday to play with other things, annoy other people in the company with emails, and track information down in EU documents.

That reminds me: I haven’t got an answer back from technical support after they forced me to change my password after my first 90 days with the company. I asked if they had any idea what proportion of employees have to write their passwords down because of this policy — even though they are instructed not to. In other words, are they wilfully ignorant of the anti-security consequences of their supposedly pro-security policies?

Tuesday, January 8th, 2013 at 1:49 pm - Julian - Vero

Battery Ventures hoovers up WorkNC

The press release says Vero Software Acquires Sescoi International, but that ain’t the truth. It would be more accurate to say that BV Acquisitions S.à.r.l. (a Luxembourg shell company that probably exists for tax reasons) acquired the company that makes WorkNC, so that people aren’t misled as to the actual forces behind it. I don’t have any experience in tracking down company data in France, but there are some details of the business here.

I’ve got a lot of historical interest in WorkNC, because in many ways it was the software which started me off. Sometime in about 1994 two of us programmers at NCGraphics were driven over to Depo, a factory making these new tungsten carbide insert tools on an industrial estate in northern Germany, sat in front of a copy of WorkNC, and told to make something at least as good as that, but designed for running their toroidal Depo tools. We travelled to Germany a few more times to closely inspect the software on the strategies where it was outperforming ours (according to the Depo engineers), as well as to learn more about machining strategies. And that’s how we got Machining Strategist off the ground.

Battery Ventures have updated their website to explain their strategy:

From years steeped in Software, the Battery team knew that mature and fragmented markets offered great opportunities for consolidation, and Europe was no exception.

Working from a successful playbook of midmarket software buyouts in the US and Canada, the team set its sites on key European markets, looking for the right situation in which to build a platform

After 9 months of intense research, team focused on the $1B CAM Software market and the universe of companies in that sector, until the one with the right fundamentals was in sight: Vero Software – a market-leading CAD/CAM company with great products, a recurring revenue base and happy customers.

The team recruited Richard Smith as an Executive in Residence, believing his 20 years of experience in the European software markets would help them to successfully diligence the opportunity and ultimately create a powerful platform to consolidate the fragmented market. Richard worked alongside the Battery team to evaluate the company and market opportunity, and build the right strategic plan for a dominant CAD/CAM vendor.

After 15 months of hard work, Battery finalized the take-private of Vero Software and appointed Richard as CEO of the newly private company.

Vero subsequently acquired Planit Software, another UK-based CAM software vendor, roughly tripling the size of the business with very little product overlap [really? --JT], creating the largest independent vendor in the market.

Executive in Residence, eh? Is that like an Artist in residence? What the heck is that all about?

Who knows what it’s like in there day-to-day. The point is to take advantage of the opportunities that flow from a set of businesses that are now under a single management where the workers are allowed to cooperate, and no longer have to interact inefficiently on the open market.

As far as I can tell, there are three strands of consolidation:

A) Consolidation of customers. By reducing competition the customers can no longer shop around and drive better prices and services.

B) Consolidation of financial engineering. While a company like Starbucks has the skills and resources to afford the costs of arranging to pay no tax while obtaining corporate welfare, most smaller companies don’t have this knowhow (they are too busy running their business). It is without doubt that Battery Ventures, whose core competency is finance, has the skills to avoid taxes that the rest of us pay to maintain the quality of civilization their associates have come to depend on. Freeloaders.

C) Consolidation of software technology. This is by identifying the best technologies across all the products and porting them from one to another so that all of the products are improved at very little cost. If Intel had taken over AMD in 1994, they would have explained how we would get better floating point units burnt into the CPU silicon of the Pentiums.

Clearly, only consolidation of type (C) is beneficial to the customers. So why don’t they demand it? We don’t even get much lip service in that area. In another world where people actually knew what the best software was in the same way that they know what the best whisky is, the press release would go like this:

For many years, Machining Strategist has been seen as having the best offset area clearing algorithms in the world. However its rest area detection scores six points below the quality of WorkNC’s routine on the bug index. We propose to move these functions across to the relevant products for a release to customers by Easter and have put our chief programmer Mr Gnu in charge of the operation. He will be supported by a team of temporary consultants Software Merge Services who are proven experts in the field of algorithm salvage and code quality assessments, with their focus on test driven re-development.

SMS was founded by Hewlett Packard in 2013 with the specific task of finding the one line of source code developed by Autonomy that had any positive end-user value. Here it is:

sys.exit(1)

Instead, we get this computer generated abstract waffle:

“We are extremely pleased to be joining the Vero Group. Since originally founding the company in 1987, Sescoi has become one of the world’s key CAD/CAM providers with WorkNC. However, with Vero’s global distribution, additional development resources and proven technology sharing concept, I am certain the products will advance at an even faster pace and continue to provide innovative solutions that boost productivity, bolster competitiveness, reduce costs and improve quality.”

This is just not good enough!

Monday, December 17th, 2012 at 4:54 pm - Julian - Adaptive, Vero

Another day another “revolutionary” new roughing strategy

Does this look familiar?

[screenshot of EdgeCam’s new roughing toolpath]

This one from EdgeCam slipped under the radar.

It seems it was released last month, or maybe earlier in the year.

It does not appear to have implemented retract steps yet — a feature we had from the start in our original 2004 Adaptive Clearing development — but the pitch of the initial clearing spiral is variable, which shows it’s on the right track.

At some point we’ll have enough of these “unique” “revolutionary” cutting strategies for an independent agency to really help us out by properly benchmarking them against one another and publishing the results. That way everyone would know where their weaknesses lay and what to focus development on.

As it is now, with all the different software companies building their own implementations of this fluke cutting technique, and falsely marketing them as though nothing like it exists anywhere else in the world, we are experiencing an extraordinary amount of wasted energy.

The waste is in the form of machinists waiting for unnecessarily inefficient and buggy implementations to complete their calculations (because the developers don’t get the crucial feedback they need to make cheap and substantial improvements to the software), and in the form of developers unwittingly working on areas of the code that have no benefit to the end user.

That’s quite apart from the unbelievable waste of certain companies sinking their finite resources into pointless patents rather than improving their code.

The Adaptive Clearing is on my mind because we have been spending what feels like months working on toolpath reordering and the multicore version of the algorithm.

The reordering algorithm was worked out on the long train back from Copenhagen, and it’s quite simple. I’ll write it up at some point. And the multicore is something that Anthony wants. He’s promised to make one of his nifty videos where he demonstrates the algorithm using a sit-on lawnmower if we ever get it finished. That’s what’s keeping me motivated when I am losing the will to push on after one too many of these queues of queuing threads hangs and everything is completely broken for a couple of days. What a great idea: take an algorithm that’s already too complicated and make it four times more complex. I’m sure it will work out. Maybe.

Tuesday, November 20th, 2012 at 5:22 pm - Julian - Adaptive

Further Surfware patent disclosure

I had reason to consult some early and embarrassing 2005/2006 entries in this blog, specifically Patent Schmatent, Patent part deux, and Nothing happened.

These concerned software patent 7451013 applied for by Surfware which had just been published while I was at Euromold that year. After running around all excited for a few hours about how this theoretical threat had manifested into something real, I got told by one of the red shirted folks on the Esprit stand (to whom we were attempting to hawk our Adaptive Clearing algorithm) that there was a way to submit our evidence of prior art to the US Patent Office before the patent was granted.

We looked into this, but it seemed to involve a submission on just the right grade of paper, plus $150 in the appropriate format for the patent office to accept — no doubt some kind of hard-to-obtain currency bank bond that would have cost us $500 in the way of these things. For details, see CFR 1.99 Third-party submission in published application.

However, there was another way. Under CFR 1.555 Information material to patentability in ex parte reexamination and inter partes reexamination proceedings:

Each individual associated with the patent owner in a reexamination proceeding has a duty of candor and good faith in dealing with the Office, which includes a duty to disclose to the Office all information known to that individual to be material to patentability in a reexamination proceeding.

So we mailed a CD with some videos to their lawyers Akin Gump Strauss Hauer & Feld LLP.

…And didn’t hear anything back.

Not being a professional patent wrangler (my job is to write code that does something, not push papers around containing useless gibberish for the purpose of preventing other people from writing code), I didn’t know of portal.uspto.gov where you could see all the documents relating to the patent process.

[screenshot of the USPTO portal document list]

And in among all these documents I found the following scan:

[scan of the examiner’s listing of the CD files, with author and dates of publication marked as unknown]

Hint: Maybe if you read exhibit (B), the Letter accompanying the CD, then the author and dates of publication of the files in exhibit (A) would not have been unknown. Dipsticks!

Just when you thought your opinion of patent attorneys could sink no lower.

Update: Filed under the It’s Worse Than You Thought department, 35 U.S.C. 122(c) states:

(c) Protest and Pre-Issuance Opposition. The Director shall establish appropriate procedures to ensure that no protest or other form of pre-issuance opposition to the grant of a patent on an application may be initiated after publication of the application without the express written consent of the applicant.

The regulatory explanation is set forth like so:

The American Inventors Protection Act of 1999 (AIPA) contained a number of changes to title 35 of the United States Code…

The USPTO interprets the provisions of 35 U.S.C. 122(c) as requiring, rather than simply empowering, the USPTO to ensure that no protest or other form of pre-issuance opposition to an application may be initiated after its publication without the express written consent of the applicant…

Following enactment of 35 U.S.C. 122(c), the USPTO revised 37 CFR 1.291 and 1.292 to prohibit third parties from submitting any protest or initiating any public use proceedings… To balance the mandate of 35 U.S.C. 122(c) that the USPTO establish “appropriate procedures” to ensure that third parties may not initiate protest or other pre-issuance opposition to an application after its publication… the USPTO promulgated 37 CFR 1.99 to permit third parties to submit patents and publications (i.e., prior art documents that are public information and which the USPTO would discover on its own with an ideal prior art search) during a limited period after publication of an application. However, 37 CFR 1.99 prohibits third parties from submitting any explanation of the patents or publications, or submitting any other information…

The USPTO considers any third-party inquiry or submission that is not provided for in 37 CFR 1.99 in a published application in which the applicant has not provided an express written consent to protest or pre-issuance opposition to be inappropriate…

I always did wonder why the avenue for challenging a patent application was so awesomely crappy.

Two things:

(1) It looks like we did exactly as much as we could, having found a loophole in this deliberate fortress of unaccountability.

(2) Anyone hoping that the legislative branch is going to clamp down on the patent excesses promulgated by an over-active judicial branch should be disappointed.

Saturday, November 3rd, 2012 at 9:14 pm - Julian - Machining

The fate of the Autodesk sponsored THREE-D Act

A few weeks ago, while doing some background homework on this here Autodesk company that appears to have bought out my life, my work and my life’s work, I uncovered this 2009-05-10 article:

Autodesk CEO Carl Bass said in a recent interview that he wants to see the government “mandate” the use of 3-D technology to prevent mistakes, reduce waste and achieve the best results.

…[Autodesk] sees a potential windfall in such requirements and has a lobbyist in Washington, D.C., as well as advocates talking to state officials who oversee how federal funds will be used.

I approve of the idea of forcing the administration of the built landscape to get beyond the paper and pencil age and into the era of proper digitization, so we could browse live maps on the internet that accurately represented all the current infrastructure — and all proposed plans of changes submitted by developers to that landscape.

Such a map could lead towards cracking the much more severe issue of the screwed-up legal infrastructure — Who owns this piece of land that I am standing on? — for the law is totally tangled and obscured and cannot be wrangled by the simple application of a shovel. It is far easier to dig a ten-foot hole to find out whether an electricity cable passes underneath than to determine who controls the corporate entity that owns the corporate being that has shares in the commercial organization that leases a particular parcel of land.


Wednesday, October 31st, 2012 at 12:07 am - Julian - Machining

Line intersecting a finite cylinder

This is part one of a two part posting that deals with one component of the 5-axis drop cutter function. (It’s also the programming problem given to me when I applied for a job at Parasolid in about 1995.)

There’s a big problem with the finite cylinder, because you need to deal with 3 separate surfaces: the tube and the two flat end caps.
As usual with mathematics, there’s a lot of setting up for the equations.

Point in cylinder

Consider the cylinder C of radius r whose axis goes from a to b, and the point p.

[diagram: the cylinder C of radius r with axis from a to b, and the point p]

Let:

  v = b - a
  vsq = v . v
  m = (p - a) . v / vsq
      ==> (a + v m - p) . v == 0  [the vectors are perpendicular]
  d = | a + v m - p |

Then p is inside the cylinder when

  d <= r  and
  0 <= m <= 1
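
Transcribed directly into Python (a throwaway sketch using 3-tuples for the vectors, not anybody’s production code):

  def point_in_cylinder(a, b, r, p):
      v = tuple(bi - ai for ai, bi in zip(a, b))
      vsq = sum(vi*vi for vi in v)
      m = sum((pi - ai)*vi for pi, ai, vi in zip(p, a, v)) / vsq
      foot = tuple(ai + vi*m for ai, vi in zip(a, v))    # a + v m
      dsq = sum((fi - pi)**2 for fi, pi in zip(foot, p))
      return dsq <= r*r and 0.0 <= m <= 1.0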

Line intersecting the cylinder

Now take the line L that goes from p to q. For which values of s is the interpolated point p (1 - s) + q s on L inside the cylinder C?

Because the cylinder is convex, the intersection is either empty or defined by the interval
s0 <= s <= s1

[diagram: the line L crossing the cylinder over the interval s0 <= s <= s1]

Without loss of generality, assume that a = 0 (so that b = v) and the line L is parallel to the Z-axis, so that points on it are of the form

  p = (qx, qy, z) = q + (0, 0, z)

We want to calculate the values of z for which this point lies inside the cylinder.
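
Jumping ahead, the whole calculation can be sketched in general form in Python (my back-of-envelope version under the definitions above, not the reduced z-axis derivation this post is building up to): the quadratic condition d <= r gives one interval of s, the linear condition 0 <= m <= 1 gives another, and the answer is their intersection with [0, 1].

  import math

  def line_cylinder_interval(a, b, r, p, q):
      # interval of s for which p + s*(q - p) is inside the cylinder
      v = tuple(bi - ai for ai, bi in zip(a, b))
      vsq = sum(vi*vi for vi in v)
      w = tuple(pi - ai for pi, ai in zip(p, a))        # p - a
      u = tuple(qi - pi for qi, pi in zip(q, p))        # q - p
      mw = sum(wi*vi for wi, vi in zip(w, v)) / vsq     # m at s = 0
      mu = sum(ui*vi for ui, vi in zip(u, v)) / vsq     # dm/ds

      # perpendicular offset e(s) = e0 + s e1, so d(s)^2 is a quadratic in s
      e0 = tuple(wi - vi*mw for wi, vi in zip(w, v))
      e1 = tuple(ui - vi*mu for ui, vi in zip(u, v))
      A = sum(x*x for x in e1)
      B = 2*sum(x*y for x, y in zip(e0, e1))
      C = sum(x*x for x in e0) - r*r
      if A == 0:                                        # parallel to the axis
          if C > 0:
              return None
          slo, shi = 0.0, 1.0
      else:
          disc = B*B - 4*A*C
          if disc < 0:
              return None
          slo = (-B - math.sqrt(disc)) / (2*A)
          shi = (-B + math.sqrt(disc)) / (2*A)

      # clip by the end planes 0 <= m(s) <= 1, then by the segment itself
      if mu == 0:
          if not (0.0 <= mw <= 1.0):
              return None
      else:
          clo, chi = sorted(((0.0 - mw)/mu, (1.0 - mw)/mu))
          slo, shi = max(slo, clo), min(shi, chi)
      s0, s1 = max(slo, 0.0), min(shi, 1.0)
      return (s0, s1) if s0 <= s1 else None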
