Welcome to Modeling | Modeling News | Modeling Videos | Latest Modeling Trends


Monday, July 30, 2007

Software facilitates urban drainage systems modeling

Designed to manage sanitary, storm, and combined sewer systems, the H2OMAP SWMM Suite includes a DWF Allocator, which automatically computes, tracks, and assigns dry weather flows generated by various customer categories. Using genetic algorithm optimization technology, the Calibrator module automatically adjusts sewer and drainage system parameters to match collected flow, depth, and velocity measurements for single-event and long-term continuous dynamic simulations.

New State-of-the-Art Extensions Herald Significant Leaps in Urban Drainage Systems Modeling

Broomfield, Colorado USA, March 15, 2006 - Underscoring its ongoing commitment to delivering pioneering technology that raises the bar for water and wastewater modeling, leading global environmental and water resources applications software provider MWH Soft today announced the worldwide availability of H2OMAP SWMM Suite. The comprehensive release expands users' power and ease of use in managing sanitary, storm and combined sewer systems, while state-of-the-art DWF Allocator and Calibrator extensions enable wastewater engineers and GIS professionals to slash modeling time and costs, minimize modeling errors, and significantly increase productivity.

Supporting native GIS data as well as the USEPA's industry-standard (and FEMA-approved) Storm Water Management Model (SWMM5), H2OMAP SWMM Suite integrates the needs of both GIS and wastewater engineering professionals in one complete, affordable package. Important cutting-edge features and a wide array of enhancements greatly simplify and accelerate urban drainage network engineering, helping wastewater engineers develop better designs and operational improvements faster and more efficiently and shape the future of this critical sector.

The new Suite's most significant strengths include a DWF Allocator module that delivers unprecedented speed, accuracy, and flexibility in calculating, distributing, and managing dry weather flows in sewer network models. DWF Allocator automatically and reliably computes, tracks and assigns dry weather flows generated by various customer categories, based on land use and development characteristics, population, sanitary service areas, parcel data, or meter data - considering existing sewer system conditions and various planning horizons.
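The allocation logic described above can be pictured with a short sketch. This is not MWH Soft's implementation; the category names, unit rates, and parcel fields are hypothetical assumptions, but they show the shape of a demand-allocation computation:

```python
# Illustrative sketch (not the actual DWF Allocator): allocating dry
# weather flows to sewer model nodes by customer category.
# Unit rates and category names below are hypothetical assumptions.

UNIT_RATES = {          # average dry weather flow per unit, gallons/day
    "residential": 70,  # per capita
    "commercial": 0.1,  # per square foot of floor area
    "industrial": 0.25, # per square foot
}

def allocate_dwf(parcels):
    """Compute each parcel's category flow and roll it up to the
    sewer node (manhole) serving that parcel."""
    node_flows = {}
    for p in parcels:
        flow = UNIT_RATES[p["category"]] * p["units"]
        node_flows[p["node"]] = node_flows.get(p["node"], 0.0) + flow
    return node_flows

parcels = [
    {"node": "MH-1", "category": "residential", "units": 120},  # 120 people
    {"node": "MH-1", "category": "commercial", "units": 5000},  # 5000 sq ft
    {"node": "MH-2", "category": "industrial", "units": 8000},
]
flows = allocate_dwf(parcels)
print(flows)  # {'MH-1': 8900.0, 'MH-2': 2000.0}
```

The real module additionally weights allocations by planning horizon and existing system conditions; the roll-up-by-node pattern stays the same.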

Another highlight is the Calibrator module, which makes it possible to reliably and swiftly calibrate very large and complex urban rainfall-runoff models, supporting better analysis and design. Using advanced genetic algorithm optimization technology, Calibrator automatically adjusts sewer and drainage system parameters to accurately match collected flow, depth, and velocity measurements for both single-event and long-term continuous dynamic simulations. These parameters can include any combination of subcatchment, soil, aquifer, RDII, and conduit properties.
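As a rough illustration of genetic-algorithm calibration, the sketch below fits a single roughness parameter so that a toy depth relation matches an "observed" value. The closed-form relation is a stand-in for a full hydraulic simulation, not SWMM itself, and all numbers are invented:

```python
import random

# Toy stand-in for a hydraulic simulator: depth as a function of
# conduit roughness n. The real Calibrator drives full SWMM dynamic
# simulations; this loosely Manning-shaped formula is illustrative only.
def simulate_depth(n, flow=2.0):
    return (n * flow / 1.49) ** 0.6

OBSERVED = simulate_depth(0.015)      # pretend field measurement

def fitness(n):
    return -abs(simulate_depth(n) - OBSERVED)  # higher is better

def calibrate(generations=60, pop_size=30, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(0.008, 0.035) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2                    # crossover
            child += rng.gauss(0, 0.001)           # mutation
            children.append(min(max(child, 0.008), 0.035))
        pop = parents + children
    return max(pop, key=fitness)

best = calibrate()
print(best)   # recovered roughness, close to the "true" 0.015
```

A production calibrator searches many parameters at once and scores fitness against whole measured time series, but the select-crossover-mutate loop is the same idea.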

"H2OMAP SWMM Suite was developed with end users in mind, and nearly all its innovations and enhancements were direct results of user specific requests and suggestions," said Trent Schade, PE, Senior Client Service Manager and National Stormwater Technology Leader for MWH Soft. "We spend hundreds of hours with our customers to learn how we can improve our products to help them achieve their engineering goals. Our new H2OMAP SWMM Suite redefines the boundaries of innovation in the wastewater modeling industry by providing new, powerful tools that make urban drainage modeling easy, fast, and fun."

"Speed, ease of use, flexibility, and power are the words that best describe H2OMAP SWMM Suite," said Paul F. Boulos, Ph.D, President and COO of MWH Soft. "Users will be amazed at how efficiently the new suite extensions can automate common, tedious modeling tasks, enabling them to quickly generate the credible models they need to maximize system performance at minimum capital investment. We will continue to develop vital new technologies that improve the modeling process, so that someday, developing a drainage model will be as easy and fast as turning on a light."

CAD Software offers tools for 2D drafting and 3D modeling

TurboCAD Professional v.12 features adjustable tool palettes and a part-tree that allows for parametric modeling of mechanical and architectural designs. Context-sensitive tools eliminate unnecessary prompts and options, and users can apply material properties to individual facets of the same object and preview blends. Horizontal/vertical constraints can be defined, and parametric doors/windows can be created and dropped directly into the walls of a 3D drawing.

********************

Enhancements Include New Mechanical and Architectural Features, Improved Modeling, and Greater Interoperability

NOVATO, Calif., March 23 -- IMSI[R] (BULLETIN BOARD: IMSI), the leader in affordable design software, today announced that the newest version of its flagship product line, TurboCAD Professional v.12, is now available for purchase.

"TurboCAD v.12 is definitely our best version yet," said Bob Mayer, Executive Vice President of Precision Design. "Improved interoperability, enhanced parametric architectural tools, and improved 3D modeling make this a highly desirable solution for professional designers. We are especially proud of the revamped Part Tree, which now effectively makes TurboCAD Professional a fully parametric design tool for the mechanical design market," added Mayer.

TurboCAD Professional v.12 delivers the most significant improvements ever packed into a TurboCAD upgrade. New and improved tools for 2D drafting and 3D modeling deliver new possibilities in mechanical and architectural design. Improvements in rendering and the new support of LightWorks[R] Archives result in even better visualization capabilities. And new, adjustable tool palettes, context sensitive tools, and intuitive work flow combine to increase productivity.

"We are delighted that this new release of TurboCAD has taken advantage of the latest LightWorks functionality," commented Clive Davies, Partner Programme Manager at LightWork Design. He added, "In particular, the ability to use LightWorks Archive materials (LWA) will allow TurboCAD users to access the growing number of high quality digital materials in the LWA format. Our experience has been that this will help users to render better images even faster."

What's New and Improved

o Redesigned "part-tree" for quick, easy parametric modeling and editing

o New context sensitive tools that eliminate unnecessary prompts and options

o Improved fully customizable user interface with innumerable user preferences

o New horizontal and vertical constraints, improved parallel plane constraint

o Improved properties palette for editing using data entry and menu options

o New ability to create parametric doors and windows and to drag and drop them directly into the walls of a 3D drawing

o LightWorks[R] 7.5 rendering engine with LightWorks Archive support

o Ability to apply material properties to individual facets of the same object and preview blends before committing to them

o New Point Marker tool allows automatic numbering of objects such as rooms, doors or windows

o Updated ACIS[R] v15 solid modeling engine

o "Flicker free" rendered views while moving, rotating and zooming

o "Pre-cache" rendering for faster 3D previews

o Simplified, on screen, text editing of 2D and 3D text

o Improved rigid and normal sweeps, including simple extrudes

o Improved materials editor for creating custom, photo-realistic materials and surfaces

Enhanced Mechanical Design

TurboCAD Professional v.12 now lets the user apply material properties to individual facets of the same object and preview blends before committing to them. More geometric constraints have been added to improve the relationship between 2D constraints and the resulting 3D derived designs. And a rebuilt Part Tree now gives a more complete view of a part's history and parameters. Additional mechanical enhancements include:

o New: ability to define horizontal and vertical constraints

o New: ability to blend previews and one click selection for edge blending

o Improved: redesigned "part-tree" for quick, easy parametric modeling and editing

o Improved: ability to create automatic and precise tangencies where splines meet other objects

o Improved: ability to define constraints for objects in parallel planes with just one click

o Improved: ability to apply different materials to different facets of the same object

Enhancements in Architectural Drafting

TurboCAD Professional v.12 now includes the ability to create parametric doors and windows and to drag and drop them directly into the walls of a 3D drawing. It also allows the selection of the specific properties of each object, including frames, hinges and knobs. TurboCAD will even automatically generate the appropriate symbol for the 2D version of the drawing. Further architectural enhancements include:

o New: Point Marker tool allows the automatic numbering of objects such as rooms, doors or windows and is great for creating legends and call-outs

Monday, July 23, 2007

Software enables complete building information modeling

Enabling users to create, manage, and share design information, Autodesk Revit Systems is designed for mechanical/electrical/plumbing (MEP) engineering. Autodesk AutoCAD Revit Series is offered in three combined solutions: with Revit Building 9 for architects and designers, with Revit Structure 3 for structural engineering and drafting, and with Autodesk Revit Systems, to facilitate user migration to and utilization of building information modeling (BIM).

********************

New Revit Systems for Mechanical, Electrical and Plumbing Engineering Completes Building Information Model for Building Design; New Revit Series Products Provide Easy Road to BIM Adoption

SAN RAFAEL, Calif., March 23 -- Autodesk, Inc. (NASDAQ:ADSK) today made it faster and easier for architects, structural engineers and building systems engineers to realize their ideas and embrace the benefits of building information modeling (BIM). With the availability of Autodesk Revit Systems, a Revit-based BIM solution for mechanical/electrical/plumbing (MEP) engineering, Autodesk has completed the building information model and now offers discipline-specific BIM solutions on the Revit platform across the entire building design enterprise. Autodesk Revit Systems will be available as part of Autodesk AutoCAD Revit Series -- Systems Plus, one of three new Revit Series products that combine AutoCAD 2007 software with the Revit family of software products to help customers move to BIM at their own pace. Revit is now the most comprehensive platform for BIM and allows customers to create, manage and share design information more effectively, contributing to increased profitability, reduced risk and fewer inefficiencies in building design, construction and management.

"Working on the Freedom Tower project with a complete building information model on the Revit platform is enhancing collaboration with the entire building design team including the architects and structural engineers," said Scott Frank, partner, Jaros, Baum & Bolles. "Adopting Revit Systems and moving to BIM is helping JB&B to improve spatial coordination and integration of the MEP systems."

New Revit Series Products Provide Road to BIM

Many Autodesk building industry customers are moving from CAD drafting software to building information modeling. Autodesk is releasing new Revit series products to help these customers gain the competitive advantages of BIM, preserve their current investments and provide the flexibility to move to building information modeling at their own pace. Autodesk AutoCAD Revit Series -- Building 9, for architects and designers, and Autodesk AutoCAD Revit Series -- Structure 3, for structural engineering and drafting, combine Autodesk's industry-leading AutoCAD 2007 software with Revit Building 9 and Revit Structure 3 respectively. Autodesk AutoCAD Revit Series -- Systems Plus combines Autodesk Building Systems 2007 with Autodesk Revit Systems into one comprehensive discipline-specific design solution to make it easier than ever for customers to experience the full benefits of BIM.

"Autodesk is bringing the power of BIM to new disciplines, and now we've completed the building information model on the Revit platform, helping drive greater efficiency, productivity and collaboration in the building industry," said Jay Bhatt, Vice President, Autodesk Building Solutions Division.

Revit Systems Completes Building Information Model for Design

The Autodesk Revit platform is the best and most complete design solution for building information modeling in the industry today. BIM is the creation and use of coordinated, consistent, computable information about a building project in design that yields reliable digital representations of the building -- representations used for design decision-making, production of high-quality construction documents, performance predictions, cost-estimating and construction planning and, eventually, for managing and operating the facility. By working together on an integrated building information model, the various firms involved in the design, construction and management of buildings can greatly increase efficiency and significantly reduce coordination errors. Real-time, consistent relationships between digital design data -- with innovative parametric building modeling technology -- provide significant advantages over traditional methods of design and construction.
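The parametric-relationship idea at the heart of BIM can be sketched in a few lines: elements register dependencies, and a change to one propagates automatically to elements derived from it. The classes and attribute names here are invented for illustration and bear no relation to Revit's actual API:

```python
# Toy illustration of parametric building modeling: an edit to one
# element automatically updates dependent elements. Not Revit's API.
class Wall:
    def __init__(self, length):
        self.length = length
        self._listeners = []          # callbacks of dependent elements

    def on_change(self, fn):
        self._listeners.append(fn)

    def set_length(self, length):
        self.length = length
        for fn in self._listeners:    # propagate the change
            fn(self)

class Door:
    """A door constrained to sit at the midpoint of its host wall."""
    def __init__(self, wall):
        self.wall = wall
        self.position = wall.length / 2
        wall.on_change(lambda w: setattr(self, "position", w.length / 2))

wall = Wall(length=10.0)
door = Door(wall)
wall.set_length(12.0)      # one edit...
print(door.position)       # ...updates the dependent element: 6.0
```

Scaled up to thousands of elements and views, this propagation is what keeps plans, sections, and schedules consistent with one model.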

Autodesk Revit Systems, an intuitive design and documentation tool that works the way engineers think and brings the power of BIM to MEP engineering, joins Autodesk Revit Building for architects and designers, and Autodesk Revit Structure for structural engineering to complete the Revit platform for BIM. Through data-driven system sizing and design, Revit Systems helps minimize coordination errors between engineering design teams, as well as with architects and structural engineers within Revit-based workflows. Through the automated creation of engineering design data and enhanced client communications, decisions can be made faster and more accurately. Firms using Revit Systems can collaborate seamlessly using building models developed in Autodesk Revit Building or Autodesk Revit Structure software. Features in the first release of Revit Systems include:

Structural Analysis of an Echinococcus granulosus Actin-Fragmenting Protein by Small-Angle X-Ray Scattering Studies and Molecular Modeling

The Echinococcus granulosus actin filament-fragmenting protein (EgAFFP) is a three domain member of the gelsolin family of proteins, which is antigenic to human hosts. These proteins, formed by three or six conserved domains, are involved in the dynamic rearrangements of the cytoskeleton, being responsible for severing and capping actin filaments and promoting nucleation of actin monomers. Various structures of six domain gelsolin-related proteins have been investigated, but little information on the structure of three domain members is available. In this work, the solution structure of the three domain EgAFFP has been investigated through small-angle x-ray scattering (SAXS) studies. EgAFFP exhibits an elongated molecular shape. The radius of gyration and the maximum dimension obtained by SAXS were, respectively, 2.52 ± 0.01 nm and 8.00 ± 1.00 nm, both in the absence and presence of Ca²⁺. Two different molecular homology models were built for EgAFFP, but only one was validated through SAXS studies. The predicted structure for EgAFFP consists of three repeats of a central β-sheet sandwiched between one short and one long α-helix. Possible implications of the structure of EgAFFP upon actin binding are discussed.
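The radius of gyration reported above can be computed directly from a model's coordinates as the root mean-square distance of its points from their center of mass. The point set below is a made-up elongated toy shape, not the EgAFFP model:

```python
import math

# Radius of gyration from 3D coordinates: RMS distance from the
# center of mass. Coordinates here are an invented toy "rod", used
# only to show the computation SAXS-model comparisons rely on.
def radius_of_gyration(coords):
    n = len(coords)
    cx = sum(x for x, y, z in coords) / n
    cy = sum(y for x, y, z in coords) / n
    cz = sum(z for x, y, z in coords) / n
    msd = sum((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
              for x, y, z in coords) / n
    return math.sqrt(msd)

# Elongated toy shape: 11 points spread along one axis
rod = [(float(i), 0.0, 0.0) for i in range(11)]
print(round(radius_of_gyration(rod), 3))  # 3.162
```

An elongated shape like this one has a larger Rg than a compact shape of the same size, which is why the SAXS Rg and maximum dimension together report on molecular shape.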

The larval stage of the cestode tapeworm Echinococcus granulosus is the causative agent of cystic hydatid disease or hydatidosis, recognized as one of the world's major zoonoses (1). This parasite requires two mammalian hosts for completion of its life cycle. Adult tapeworms develop in the small intestine of definitive hosts (domestic dogs and wild canids), whereas the metacestode or hydatid cyst usually develops in the liver or lungs of intermediate hosts (mainly in ungulates, and accidentally in humans). The pathological effect of the disease is caused by the pressure exerted by the hydatid cyst on the intermediate host's viscera. Within the cyst, protoscoleces are produced by asexual reproduction and develop into the adult worm when ingested by the definitive host.

The E. granulosus complex life cycle involves important changes in cell morphology and physiology (2). The molecular and cellular mechanisms involved in E. granulosus development are still largely unknown but are likely to require extensive cytoskeleton reorganization (3,4).

The actin cytoskeleton is a vital component of several key cellular and developmental processes in eukaryotes, such as motility, cytokinesis, cytoplasmic organization, and endocytosis (5). In cells, the assembly and disassembly of actin filaments, in addition to their organization into functional three-dimensional (3D) networks, are regulated by a variety of actin-binding proteins (6-10). Among these proteins, those from the gelsolin superfamily control actin organization by severing filaments, capping filament ends, and nucleating actin assembly (11).

The best-studied members of this protein family are severin (12-14) and fragmin (15,16) from Dictyostelium discoideum and Physarum polycephalum, respectively, and gelsolin (17,18) and villin (19,20) from higher organisms. A common feature of this family is the segmental organization into three (severin, fragmin) or six (gelsolin, villin) homologous domains that might have evolved from an ancestral one domain protein through a stepwise process, involving a gene triplication followed by an additional duplication event (21,17). The activities of these proteins are often modulated by signaling molecules, such as Ca²⁺ or phosphorylated phosphoinositides (22). Based on gelsolin (23,24), the most extensively studied member of the family, it is generally accepted that the second domain binds F-actin, whereas the first domain (and the fourth one, for six domain proteins) binds G-actin.

So far, only a few proteins of the gelsolin superfamily have solved 3D structures. A search in the database of protein structures indicates that of all known members of this family to date, gelsolin is the only protein that has its full-length (six domains) structure solved (25). Structures of other proteins, like villin (26) and severin (27,28), have been determined only for the first or second domains, usually bound to ligands (Ca²⁺ and/or actin). Domain comparisons between the known structures show that they all share a common fold built around a central five-stranded mixed β-sheet, which is flanked by a long α-helix running parallel to the sheet and a short α-helix running perpendicular to it (25-29).

Our laboratory has previously cloned and functionally characterized a 42 kDa actin filament-fragmenting protein from E. granulosus (EgAFFP) (30). The recombinant EgAFFP protein is recognized by sera of ~69% of human hydatid disease patients (31) and, in vitro, was able to induce actin polymerization and sever actin filaments, confirming that it belongs to the gelsolin superfamily (30). According to sequence analysis, EgAFFP presents three repeated domains and is similar (36% identity) to the gelsolin NH₂-terminal half (G1-G3). The lack of structural data for full-length three domain members of the gelsolin superfamily, such as EgAFFP, represents an obstacle to the understanding of structure-function relationships of these smaller proteins, which are functionally equivalent to their six domain counterparts. Structural characterization of EgAFFP might help to understand how three domain members function and how they are regulated by calcium.

Thursday, July 19, 2007

Characterization of the Structure of RAMP1 by Mutagenesis and Molecular Modeling

Receptor activity modifying proteins (RAMPs) are a family of single-pass transmembrane proteins that dimerize with G-protein-coupled receptors. They may alter the ligand recognition properties of the receptors (particularly for the calcitonin receptor-like receptor, CLR). Very little structural information is available about RAMPs. Here, an ab initio model has been generated for the extracellular domain of RAMP1. The disulfide bond arrangement (Cys27-Cys82, Cys40-Cys72, and Cys57-Cys104) was determined by site-directed mutagenesis. The secondary structure (α-helices from residues 29-51, 60-80, and 87-100) was established from a consensus of predictive routines. Using these constraints, an assemblage of 25,000 structures was constructed and these were ranked using an all-atom statistical potential. The best 1000 conformations were energy minimized. The lowest scoring model was refined by molecular dynamics simulation. To validate our strategy, the same methods were applied to three proteins of known structure: PDB:1HP8, PDB:1V54 chain H (residues 21-85), and PDB:1T0P. When compared to the crystal structures, the models had root mean-square deviations of 3.8 Å, 4.1 Å, and 4.0 Å, respectively. The model of RAMP1 suggested that Phe93, Tyr100, and Phe101 form a binding interface for CLR, whereas Trp74 and Phe92 may interact with ligands that bind to the CLR/RAMP1 heterodimer.
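The validation metric used above, root mean-square deviation between corresponding atoms of a model and a crystal structure, is straightforward once the two structures are superposed. A minimal sketch with toy coordinates (superposition, e.g. by the Kabsch algorithm, is assumed to have been done already):

```python
import math

# RMSD between corresponding atoms of a model and a reference
# structure. Assumes the coordinate sets are already superposed;
# the toy coordinates below are invented for illustration.
def rmsd(model, reference):
    assert len(model) == len(reference)
    sq = sum((mx - rx) ** 2 + (my - ry) ** 2 + (mz - rz) ** 2
             for (mx, my, mz), (rx, ry, rz) in zip(model, reference))
    return math.sqrt(sq / len(model))

ref   = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
model = [(0.0, 0.5, 0.0), (1.0, 0.5, 0.0), (2.0, 0.5, 0.0)]
print(rmsd(model, ref))  # 0.5
```

RMSD values around 4 Å, as reported for the three test proteins, indicate that the modeling protocol recovers the overall fold rather than atomic detail.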

G-protein-coupled receptors (GPCRs) represent one of the largest protein families within the human genome. They have a characteristic architecture, consisting of seven transmembrane (TM) helices. Ligands bind to the extracellular face of the receptor or to a pocket formed within the TM region. In contrast, G-proteins bind to the intracellular face of the receptor.

Until recently, GPCRs were considered to act essentially as monomers. However, there is now considerable evidence that many form dimers or other oligomers (1). Most attention has been focused on dimers between GPCRs, but other proteins can also be involved. These include the family of receptor activity modifying proteins (RAMPs). These were first identified as partners for the calcitonin receptor-like receptor (CLR). CLR by itself is unable to bind any ligand; however, in the presence of RAMP1 it functions as a receptor for calcitonin gene-related peptide (CGRP), whereas in the presence of RAMP2 it becomes an adrenomedullin (AM) receptor. The CLR/RAMP3 complex also preferentially binds AM, but it has a greater affinity for CGRP than CLR/RAMP2 (2). Subsequently it has been shown that RAMPs can associate with a number of other receptors, including the calcitonin, parathyroid hormone 1 and 2, vasoactive intestinal peptide/pituitary adenylate cyclase activating polypeptide (VPAC₁, VPAC₂), glucagon, and calcium-sensing GPCRs (3-5).

All three RAMPs are thought to be built around a common architecture (2,6) (Fig. 1). They have a short, intracellular C-terminus followed by a single TM region. The largest part of the protein is the extracellular domain: ~90 amino acids for RAMP1 and RAMP3, whereas for human RAMP2 this domain is 13 residues longer. All RAMPs have four conserved cysteine residues; RAMP1 and RAMP3 have an additional pair.

It seems that the N-terminus is the major determinant of ligand binding (7,8). The structure-function relationships of RAMP2 and RAMP3 have been investigated by use of protein chimeras; these have identified residues 86-92 of human RAMP2 and 59-65 of human RAMP3 as key epitopes for AM binding (9). Deletion analysis of human RAMP3 suggested that residues 91-103 formed an important epitope for CGRP binding (10). In human RAMP1, Trp74 is important for high-affinity binding of BIBN4096BS, a nonpeptide antagonist of CGRP; the mutation W74K substantially reduced antagonist affinity (11). There is no structural explanation for the effect of any of these mutants, and it is unclear whether the residues or epitopes make direct contact with the ligands or act indirectly to stabilize ligand binding sites. In addition, the cysteines in the N-terminus probably form disulfide bonds. Although some information has been obtained from previous studies (12), to date, there has not been any systematic mutagenesis study of their topology.

In this study we have produced mutant RAMP1 constructs that incorporate all possible pairwise combinations of Cys-to-Ala mutations, to determine the organization of the disulfide bond network in hRAMP1. In addition, we have produced an ab initio molecular model of RAMP1 which is entirely consistent with the mutagenesis data presented in this study and also provides a mechanistic basis for mutagenesis data previously published by other laboratories.
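The pairwise mutagenesis strategy above is easy to enumerate: with six cysteines there are 15 possible double mutants. A short sketch using the positions named in the abstract (the construct naming convention here is ours, not the paper's):

```python
from itertools import combinations

# Enumerate all pairwise Cys-to-Ala double mutants of RAMP1.
# Positions are the six cysteines named in the abstract; the
# "CxA/CyA" naming convention is an illustrative assumption.
CYS_POSITIONS = [27, 40, 57, 72, 82, 104]

mutants = [f"C{a}A/C{b}A" for a, b in combinations(CYS_POSITIONS, 2)]
print(len(mutants))   # 15 double-mutant constructs
print(mutants[0])     # C27A/C40A
```

Only 3 of the 15 pairings correspond to the actual disulfide bonds (27-82, 40-72, 57-104); testing the full set is what lets the arrangement be assigned unambiguously.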

Modeling Starburst Cells' GABA_B Receptors and Their Putative Role in Motion Sensitivity

Neal and Cunningham (Neal, M. J., and J. R. Cunningham. 1995. J. Physiol. (Lond.). 482:363-372) showed that GABA_B agonists and glycinergic antagonists enhance the light-evoked release of retinal acetylcholine. They proposed that glycinergic cells inhibit the cholinergic Starburst amacrine cells and are in turn inhibited by GABA through GABA_B receptors. However, as recently shown, glycinergic cells do not appear to have GABA_B receptors. In contrast, the Starburst amacrine cell has GABA_B receptors in a subpopulation of its varicosities. We thus propose an alternate model in which GABA_B-receptor activation reduces the release of ACh from some dendritic compartments onto a glycinergic cell, which then feeds back and inhibits the Starburst cell. In this model, the GABA necessary to make these receptors active comes from the Starburst cell itself, making them autoreceptors. Computer simulations of this model show that it accounts quantitatively for the Neal and Cunningham data. We also argue that GABA_B receptors could work to increase the sensitivity to motion over other stimuli.

One of the most important unanswered questions in retinal neurobiology is why the Starburst cholinergic amacrine cells have two neurotransmitters (1-4). These cells produce both acetylcholine (ACh) and γ-aminobutyric acid (GABA), releasing them upon light stimulation (5-9). The released ACh has many roles in the retina, including the enhancement of motion sensitivity (10-12) and the establishment of directional selectivity for some types of stimuli (13-20). In turn, recent evidence shows that GABA is the main transmitter involved in directional selectivity (6,7,21). In this article, we discuss another possible role suggested by the cholinergic amacrine cells themselves containing GABA_B receptors (22). Because these receptors often work as autoreceptors in the brain (23-25), this raises the possibility that GABA from these cells feeds back onto them to control their release of ACh. This control could be by hyperpolarization (26,27), by reducing a Ca²⁺-dependent current (28-30) through a G-protein mechanism (30-32), or by facilitating an L-type Ca²⁺ channel (33). The results of Neal and Cunningham (5) coupled to the results of Zucker et al. (22) lend some support to such an autoreceptor control (see also (34)). Neal and Cunningham showed that the GABA_B agonist baclofen and the glycinergic antagonist strychnine enhance the light-evoked release of retinal ACh. Considering these results, they proposed that glycinergic cells inhibit the Starburst cells and are in turn inhibited by GABA through GABA_B receptors. However, as shown by Zucker et al. (22), glycinergic cells do not appear to have GABA_B receptors. Consequently, one must search for an alternate hypothesis for the role of these receptors. The simplest alternative given the available data is that GABA_B agonists enhance the release of ACh by acting on the Starburst cells themselves. 
These cells may synapse onto glycinergic cells (which probably include the cholinoreceptive DAPI-3 cell (22,35,36)) through muscarinic receptors. (The receptors may be chemically ephaptic, that is, ACh may diffuse to targets far from the presynaptic site; this would help to explain the apparent dearth of conventional synapses made by Starburst cells onto noncholinergic amacrine cells (37).) In addition, retinal cholinergic receptors are often far away from the site of cholinergic release (38); the glycinergic cells targeted by ACh may provide contact back onto Starburst cells (5,39). Hence, the activation of GABA_B receptors may result in disinhibition.

In this article, we use a biophysical model to test the feasibility of the GABA_B-autoreceptor hypothesis for Starburst cells. To know whether this hypothesis will work is not so easy. One difficulty is to know how GABA_B agonists reduce the muscarinic input to the glycinergic cell at the same time that they increase the overall release of ACh. Perhaps the answer lies in the recent surprising finding that only ~25% of Starburst-cell varicosities contain GABA_B receptors (22). If the input to glycinergic cells came only from these varicosities, then GABA_B agonists might affect these cells without reducing ACh release from other varicosities. However, the model must solve another problem with GABAergic action on Starburst cells. The release of ACh from Starburst cells may also be inhibited by GABA through GABA_A receptors (8,16,39,40). How is it that the GABA that putatively feeds back to the GABA_B autoreceptor does not inhibit the ACh release through the GABA_A heteroreceptor? The model provides answers to these questions and fits the Neal and Cunningham data well. An abstract version of the model appeared elsewhere (41).

The next section of this article will provide the model assumptions and their justifications, along with a physical description of the model. That section will include no equations to facilitate the comprehension of the ideas. The model equations and the parameters used in the simulations will appear in Appendices A and B, respectively.

Wednesday, July 18, 2007

Balancing Style and Worth with Your Home Remodeling Costs

The classic way to increase the value of your house is to spend some of your home remodeling budget on your existing rooms or on additional floor space. But since you're spending anyway, why not make the most of what you can get?

You can choose to add new appliances, cabinets, or fixtures; or, if your home remodeling budget allows, you can push the project much further. What's important is that you achieve the look and use you most want for your home. You can start by choosing the right space to remodel.

Home remodeling costs prove most worthwhile when spent on new additions to the two rooms that can carry an architectural style of their own: the kitchen and the bathroom. This way you protect the character of the rest of your home by keeping it within the existing framework, while adding attraction to the kitchen or bath instead.

And you don't really need to buy the most expensive materials for your new additions. You can keep your home remodeling costs within the budget of simple repairs that are made to last: a practical, clean style, so to speak. But of course you can always go for some simple luxuries. You can have your bathroom enlarged, or take in a sunken whirlpool tub, a modern shower, or an extension spa or dressing room. The same goes for the kitchen: you can simply replace broken tiles, put in a charming breakfast nook or a new kitchen island, or install elegant granite countertops for more drama.

In contrast, the least financially rewarding places to spend home remodeling costs are areas such as the basement, garage, yard, or walkway (unless you're turning them into completely new living space). Adding a swimming pool should also be reconsidered carefully, since its cost is high and must include maintenance outlay. It may all be worth it in the end, but for now, consider the cost.

You can easily remodel just about any part of your house without adding too much to your overall home remodeling costs. Replacing worn carpeting, tiles, and wood floors will give you the immediate improvement you’re looking for. You can also simply update the paint colors or add new wall coverings. Bear in mind, too, that many stylish and trendy items can be found at discount prices at garage sales and outlet stores. Who knows, you may even be fortunate enough to find a superb piece of cabinetry that fits perfectly with your kitchen’s theme. You can even recycle your existing pieces out of their old context and into a completely fresh function. Just remember that the true effectiveness of a space comes from the harmonious balance of all of its components. You may not always keep things simple, but be consistent. Then even the smallest and cheapest changes in your home can make a world of difference.

Idea For Remodeling A Small Bathroom With A Towel Warmer Wall Mount

A great space saving idea for remodeling a small bathroom

When you remodel a small bathroom the most important question you have to ask yourself is "how can I make more room?" You want to get the most out of your new ceramic paradise and often that means getting as much into it as you can.

When I remodeled my small bathroom I wanted to include a decent shower. The one I was using over my bathtub just wasn't doing it for me and I had set my heart on a shower cubicle apart from the bathtub completely. My problem of course was lack of space.

When I looked at my old bathroom it was already full of fixtures, and I had a hard time working out how on earth I was going to rearrange things to add a shower into the mix. The breakthrough came when my contractor made what I thought at the time was a silly suggestion – one I came around to believing could work in any bathroom, including yours.

Bathroom towel warmer wall mount

My contractor suggested that I throw away the radiator that was taking up space at floor level and replace it with a nice-looking stainless steel towel warmer that would heat the bathroom as well as the towels. OK, so where’s the space saving?

He then suggested that I use a wall-mounted towel warmer to double as a radiator and fit it on the wall over the end of the bathtub. This would save a lot of space because the radiator would now be on a wall that would otherwise be doing nothing.

The more I thought about this suggestion the less silly it became so I went with it and you can see the result in a bathroom remodeling before and after picture here: Remodeling Pictures Of Bathrooms Before And After

Could this work in your small bathroom?

Careful small bathroom remodeling can work wonders

The wall-mounted towel warmer idea worked out very well for my small bathroom, and it could do the same for you. You will find that the bathtub gets used a lot less if you have a separate shower, so the heated towel rail can be used for warming towels most of the time. It will heat your bathroom, stay out of the way, and not take up any of your precious floor area.

If you spend time on the design of your small bathroom remodeling project before you start any work you will be surprised at how much space you can save by experimenting with small sized toilets, small deep bathtubs, tiny wash basins and carefully shaped small shower cubicles. It's worth taking the trouble to do this at the start of your project because you will reap the benefits later.

Thinking About Remodeling? Don't Fall Into "The Money Pit"

One of my favorite movies is “The Money Pit” with Tom Hanks and Shelley Long. They play newlyweds who want to buy a house but don’t have much money. When they stumble on a beautiful old mansion selling for an unbelievably low price, they buy what they think is their dream house.

As soon as they move in, the house starts falling apart, and they have no choice but to remodel it from top to bottom; the remainder of this hilarious movie is devoted to that process. From working with the indifferent contractor to putting up with no water, or even stairs, the movie is an extreme example of what could happen during a remodel, and watching it reinforces the fact that a home remodeling project can lead to high family stress levels and, in extreme cases, divorce.

Fortunately, this is one example where, for the most part, life does not imitate art, and most people who are thinking of remodeling their home will not have anywhere near that experience. There are, however, lessons to be learned, and it would be wise to watch this movie before you commit to any major remodeling plans for your house.

As you are most likely not in a situation where a complete overhaul is required immediately you will have time to decide exactly what it is you want to accomplish. Do you want that bathroom retreat with the Jacuzzi tub? Maybe you would like to finish off the basement and have a great place for parties? Is the family outgrowing the house and you need to add another bedroom?

These are all questions you need to answer up front. Write down on paper what it is you want and sketch it out so you have something to work from. There is some great inexpensive software available that you can use on your PC to design just about anything from building a new house to adding an additional room.

Unless you plan on doing the work yourself once you have your plan you need to find a reliable contractor. This is much easier said than done and you would be well advised to talk to at least three contractors before picking one. Ask for references and please do check them out. See if you can inspect some previous work they have done and check to see if they are in good standing with the Better Business Bureau. Ask family, friends and neighbors if they can recommend someone they may have used for past projects.

Get a rough idea of what the project will cost from each contractor. If it’s more than you expected you might be able to cut corners a bit by using less expensive materials or maybe downsizing the project a bit. Be wary of any contractor that says they can do the job for an amount that is unrealistically lower than your other bids.

Now you can set a realistic budget that you must stick to. Be sure to build in a little extra for the inevitable surprises that will crop up. Get everything in writing in case you have problems during the project. Have it spelled out exactly what will happen, who will do it, and when, and don’t be afraid to consult a lawyer if a large sum of money is involved.

If you need to finance the project with a second mortgage or home equity loan be sure you are not overextending yourself financially and think long and hard about whether you really need to remodel. Maybe it is something that you can put off until you have saved up a bit more money thus saving you some financing costs.

In the end, if you plan everything up front, pick a good contractor, budget wisely and get everything in writing your remodeling project should come off without a hitch and you will be very happy with the results.

Tuesday, July 17, 2007

Do You Want to Start a Mobile Home Remodeling Addition?

A mobile home remodeling addition is a great way to add more space to your home. Most mobile homes are fairly small and it is difficult to have a growing family in one. If you own the land that your home is on, you might want to consider adding an addition to your home.

You can use a computer program to design your addition or just make up some of your own designs. You can also hire an architect who is trained in creating additions for real estate that is similar to what you want your home to look like.

The Decision Process

Deciding on the interior design of your addition is often the hardest part. You will only have a limited amount of money, but you want to maximize the space that you are adding on to your home. Homes that have a good design will automatically feel more comfortable.

When you start to remodel you will also need to decide on the quality of products that you will be putting into your home. Building products come in a variety of styles and costs, so you will need to do your homework to make the right decision for your needs.

There are many different little decisions that you will have to make throughout the process of your remodel or addition. You may have to pick out cabinets, create a color palette, or even decide where electrical outlets need to go.

The Plan

It is important to have a plan when you begin your remodeling project. You will need to create a budget and revise it as needed throughout the building process. Creative home remodeling is often necessary to stay within your ideal budget.

Once you have come up with a reasonable budget, you will need to start looking into contractors who can complete the work for that budgeted price. You will want to find a contractor who is good and has references from previous clients. Make sure to actually call those past clients and talk to them about their projects.

After hiring a contractor you will want to stay as active in the remodeling process as you can. Stay informed of the progress and continually ask for updates. It will be easier to remedy any problems if you know about them well in advance.

Remodeling or adding onto your home is a fun and exciting project, but it can also be a bit overwhelming if you have not planned correctly. Make sure you spend the necessary time planning your project so you do not have more problems later on.

Stick to your budget as closely as you can and you will avoid spending more money than you intended. The additional space that you add on is a great way to increase your home's value while adding comfort to your family's daily life. Now get started on planning your mobile home remodel. It is a long process, full of ups and downs, but well worth it in the end!

Cost Of Remodeling A Home

What are the costs of remodeling a home? The answer to this question may be more complicated than it seems. Many homeowners certainly wish for an immediate answer so they can evaluate whether they should remodel their home or leave it as it is. In most cases, a quick visual evaluation is not enough. It requires a walk-through by a professional home architect or designer. It is people like these who have the experience and knowledge to understand what you really want and how much it will cost you.

Why It Is So Complicated To Predict The Cost Of Remodeling A Home

Every home is different. Even two houses built from exactly the same plans are different. Why, you may ask? Because they are used by different people or families, and each family uses their house in the way that works best for them. One family might be extremely careful with their property, performing periodic maintenance continually and assuring themselves that the value of their home doesn't fall.

Other families may not be interested in doing this kind of maintenance. They may not even know that a house has to be periodically maintained, just like a car. This is a pity, because some people may think of it as a cost when it really is an investment, especially if they consider that at some point in their lives they may want to sell their home.

Other factors that influence the cost of remodeling a home are the number of owners the house has had and the quality of the materials with which it was built.

As you can see, there are many issues that have to be taken into consideration by a professional if you want to make an estimate and not a 'guesstimate' of the real cost of remodeling a home.

Where Can You Find Someone Who Can Give You A Real Cost Estimate?

There are roughly two types of sources:

- the yellow pages and

- the world wide web

Each of them has its own advantages. The yellow pages can give you information on your local home specialists; the disadvantage, however, is that you won't know how good they are unless they can provide you with a list of previous clients. If you use the internet, you can compare different types of contractors and choose the one you prefer.

A home remodeling project takes time, often more than we expect. I therefore recommend that you prepare yourself to do some research. Remodeling a home is normally an expensive affair, although it depends on what you are actually going to do, so you'd better make a good plan before you begin to spend your money.

Remodeling Your Garage

Ever thought of improving your garage, or remodeling it to add more space to your house? A very cost-effective way to gain more space is to remodel your garage. If you no longer use it to park your car, you can greatly expand your home’s living space. Apart from that, remodeling your garage can help you become more organized and stop storing everything in it.

The garage has traditionally been used as a workshop or a place to park your car, as it provides a perfect environment for work. Many people are now converting their garages for use as a gym or as another living room designed for special activities. However, remodeling your garage does not necessarily mean that you have to sacrifice your storage space or your parking area. Garages today have become multifunctional, allowing cars, general storage, a workshop, and even a home office all in the same place.

Because many garages are attached to the home they can benefit from the same comfort as a home: phone wires, heating and cooling and plumbing. All you need is a little imagination, a fixed budget and a plan.

Moreover, when remodeling your garage you should consider an addition above it. Adding a room above the existing garage for work space, or even as living space for one of your kids (they usually love this), can give you additional space and also increase the value of your home without major changes to the floor plan.

If you feel the need to remodel your garage but you don’t know exactly what you want or need, you should consider the following ideas.

1. Converting your garage into a laundry room, if you have your laundry room in the basement, will eliminate the need to go up and down the stairs.

2. A music studio can be a great choice for your garage remodeling plans if you or your kids have some tendencies in this direction. The garage is the place where many “garage bands” got their start. To avoid your neighbors’ complaints consider soundproofing your garage walls.

3. Transform your garage into a gym. If you do not have enough room to store your equipment, you can always move it into the garage and have plenty of space for your daily exercises and training.

Before you decide to remodel your garage, first consider your budget and determine how much you can allocate to the project. If you are converting your garage into a living area, you need to consider adding additional phone lines, electrical wiring, heating, cooling, and plumbing.

However, if you prefer your garage to remain your workshop, you could consider changing or improving the following:

1. Storage cabinets
2. Workbench
3. Drywall
4. Flooring
5. Plumbing
6. Electrical
7. Heating/Air/Ventilation
8. Countertops
9. Ceilings
10. Interior walls
11. Interior/Exterior doors
12. Laundry/Mud room
13. Bathroom

Monday, July 16, 2007

Teachers' modeling advantage and their modeling effects on college students' learning styles and occupational stereotypes: a case of collaborative teaching

Hospitality has been a mainstream service industry in the 21st century. It is also designated as the target industry by the Taiwanese government. Higher education in the hospitality field in Taiwan is managed by the system of vocational education and aims to raise the level of expertise among practicing professionals. In order to maximize efficiency in training professionals, collaborative teaching has been widely adopted. The technical courses are taught by both professionals (as the technical teachers) and the academic instructors (as the lecturing teachers) to the same students simultaneously. The technical teachers are primarily responsible for the demonstration of practical skills, whereas the lecturing teachers are in charge of illustrating the principles and theories underlying those skills.

Social learning theory (Bandura, 1977, 1986) has noted that a considerable amount of learning takes place in the absence of direct reinforcement, either positive or negative, through a process called modeling or observational learning. Modeling, then, is a process through which individuals learn behaviors, attitudes, values, and beliefs by observing others and the consequences of others' actions. In an educational context, teachers are one of the important role models in students' learning processes. If students recognize teachers as role models, teachers will have an impact on what students learn through social learning.

Considering the modeling effects of teachers on students in technical courses, students' learning styles and occupational stereotypes were chosen in the present study because they are both important to learning and career development. Learning styles influence the academic achievement of college students and certain styles may be more effective for particular activities in the classroom environment (Matthews, 1991). Reading-Brown and Hayden (1989) found that, in higher education, college students' learning styles impact their choice in entering a given institution that would meet their needs. On the other hand, occupational stereotypes are important to students in terms of learning, future career development, and life decisions. Recent studies point out that the modeling effects of teachers have a considerable influence on students' occupational stereotypes (Franken, 1983; Reid, 1995; Beall & Sternberg, 1993). Fox and Renas (1977) demonstrated that students' role models had a crucial influence on their vocational roles. Tiedemann (2000) found that the beliefs of teachers were effective predictors of students' gender-role stereotypes. Thus, teachers in the collaborative teaching of technical courses may affect college students' learning styles and occupational stereotypes through their modeling behaviors.

Within the context of collaborative teaching in technical courses, two teachers engage the same students, who may exhibit different learning styles, personalities, and missions. It would be interesting to learn, when the two role models are juxtaposed, which has more impact on students' learning styles and occupational stereotypes. This study attempts to construct a concept of modeling advantage to investigate the relative advantage of competing types of instructors in students' social learning. Modeling advantage is defined as the advantage that student observers perceive one of the two individual role models to hold over the other. This can be a dominant factor in learning style as well as in career type.

In this study, we sought to investigate the modeling effects of two kinds of teachers (the technical teachers vs. the lecturing teachers) in collaborative teaching courses on college students' learning styles and occupational stereotypes.

MODELING ADVANTAGE IN COMPETING MODELS

In everyday life, people often learn by observing what others are doing. Through social learning, we do not learn directly, but rather by observing others. Bandura (1977, 1986) and his colleagues have performed numerous experiments showing that social learning is an effective way of learning. In an educational context, teachers are important models for their students. If teachers become role models, their behaviors, professional knowledge, moral standards, values, beliefs, and ideology can be imitated by students and may be internalized.

However, there are competing models in the educational context. For instance, in courses in vocational education, the technical teachers and the lecturing teachers are two potential models from whom students may learn. Thus, it is important to investigate which model will be more influential. In this study, modeling advantage was tentatively offered to shed light on this issue. Modeling advantage depicts the likelihood of a teacher model being imitated by students over other competing models in a particular class. As a result, the teacher with greater modeling advantage will have a higher probability of being imitated. This concept may provide insight into the differential effects of competing models on students' social learning.

Saturday, July 14, 2007

MODELING ALTERNATIVE MOTIVES FOR DIETING

A recent and growing literature considers the economics of weight change and obesity. The leading questions have been "what accounts for the observed rise in obesity over time?" [Chou et al 2004, Lakdawalla and Philipson 2002, Cutler et al 2003] and "why do people (especially rational agents) choose to be overweight?" Because body weight can be adjusted by diet and exercise, "obesity is an avoidable state" and "economists expect these adjustments in behavior to take place if the benefits of adjustment exceed the costs" [Philipson 2001, 1].

But many overweight people prefer not to be overweight, as the existence of a sizable diet industry suggests. Americans pay $40-$100 billion annually to help themselves lose weight. The Wall Street Journal recently reported that "at any time, 29 percent of men and 44 percent of women are on a diet" [Parker-Pope 2003, R-I]. Even if these estimates are high, it is hard to gainsay the fact that millions diet with the aim of losing weight.

This paper asks "why do people diet?" The proximate answer is "to lose weight." But because there are different ways by which a person becomes heavier than he wants to be, the ultimate causes of the decision to diet are different. This has theoretical and empirical implications that we explore with a simple graphical model that determines desired weight and shows how different causes induce dieting.

The paper proceeds as follows. The first section discusses the physiology of weight determination. The next section sets out a plausible list of diet causes. The following section analyzes these alternative causes within a general production function/utility framework. A simple graphical exposition shows how an individual's optimal weight is determined, yielding propositions about "optimal overweightedness." There follows a section which shows how several of the causes of dieting identified earlier can be usefully analyzed using this graphical framework. A final section presents conclusions about implications, applications suggested by the analysis, and possible extensions.

BACKGROUND: THE PHYSIOLOGY OF WEIGHT DETERMINATION

The production function for weight determination begins with the view that weight gain results when energy (calorie) intake exceeds energy use. Calories are expended in exercise, digestion of food, and "basal metabolic rate" (BMR), the latter being the energy the body expends when at rest. Basal metabolism is in fact the largest source of energy expenditure. A standard result in the physiology/nutrition literature is that BMR declines with age.

That metabolism slows with age suggests the following proposition: if an individual maintains the same level of calorie intake and exercise as he or she ages, that person will gain weight. This happens because energy (calorie) intake is constant, but energy use declines. Indeed, in Suranovic-Goldfarb-Leonard 2002 [hereafter SGL 2002] and Suranovic and Goldfarb 2006 [hereafter SG 2006], we harness a widely used empirical relation from the physiology literature, the Harris-Benedict equations, to obtain numerical estimates of the decline of BMR, and therefore calorie expenditure, with age. These estimates are then used to generate weight-change scenarios.
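The decline of BMR with age can be made concrete with the classic Harris-Benedict formula for men; a minimal sketch follows, using the widely cited coefficients (the papers cited above may use a different calibration, so treat the exact numbers as illustrative):

```python
# Classic Harris-Benedict BMR equation for men, in kcal/day.
# Coefficients are the commonly quoted ones; SGL 2002 / SG 2006
# may use a revised calibration.
def bmr_men(weight_kg, height_cm, age_years):
    return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_years

# For a fixed 70 kg, 175 cm man, BMR falls by 6.755 kcal/day per year
# of age, i.e. roughly 68 kcal/day per decade:
for age in (30, 40, 50):
    print(age, round(bmr_men(70, 175, age), 1))
```

Holding calorie intake and exercise fixed, that roughly 68 kcal/day drop per decade becomes a standing daily energy surplus, which is exactly the mechanism behind the weight-gain proposition above.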

The proposition that weight will rise with age even with constant calorie intake is consistent with evidence that weight does in fact rise with age. Costa and Steckel [1997, 55] examine body mass index (BMI) by age (from age 19 to 72) for a number of cross-sections from 1864 through 1991. The 1991 cross-section, for example, shows body mass index rising from between 23 and 24 at age 18-19 to between 26 and 27 at age 50-64, then falling to a little below 26 at age 65-79. Cutler et al [2003] also find that weight increases with age up to an age between 50 and 55.
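Since BMI is weight in kilograms divided by height in meters squared, the BMI movement in these cross-sections translates directly into kilograms at a fixed height; a quick illustrative calculation (the height used here is an assumption, not taken from the data):

```python
# BMI = weight / height**2; invert it to see what a BMI change means in kg.
def weight_for_bmi(bmi, height_m):
    return bmi * height_m ** 2

height = 1.75  # illustrative height, not from Costa and Steckel
young = weight_for_bmi(23.5, height)   # midpoint of the age 18-19 BMI range
middle = weight_for_bmi(26.5, height)  # midpoint of the age 50-64 BMI range
print(round(middle - young, 2))  # about 9.2 kg gained over the interval
```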

These empirical findings about weight gain with age, and the underlying contribution of falling BMR, provide important information for our modeling of weight choice and the incentive to diet.

A TAXONOMY OF CAUSES OF DIETING

People diet in an attempt to lose weight, but there are varied causes of perceived overweightedness. Understanding possible motives for dieting seems an important step in deepening our ability to analyze dieting phenomena both theoretically and empirically. In this section we provide a provisional taxonomy of diet causes, including: (i) "aging-associated" dieting; (ii) "disease-provoked" dieting; (iii) "physical-life-events-provoked" dieting; (iv) "style-provoked" dieting; (v) "smoking-cessation" dieting; and (vi) "innovation-provoked" dieting. Brief elaborations follow.

(i) "aging-associated" dieting. This kind of dieting stems from the fact, discussed in the previous section, that weight increases with age, given constant calorie intake. The age-associated weight gain may create incentives to diet. This motive is investigated in both SGL 2002 and SG 2006.

(ii) "disease-provoked" dieting. This kind of dieting stems in the most extreme case from what might be called the "diet or die" motive. An individual is diagnosed with a medical condition requiring that he lose weight to reduce threats to health or even life.

MODELING LEAN, AGILE, AND LEAGILE SUPPLY CHAIN STRATEGIES

The merits of lean and agile supply chain strategies have been much debated among practitioners and academics. While these strategies are often viewed as opposites, this research supports the view that they need not compete and can, in fact, be employed simultaneously through a so-called "leagile" approach. Lean, agile, and leagile strategies are illustrated by modeling their respective applications at a tier-1 supplier to the Heating, Ventilating, and Air-Conditioning (HVAC) industry. Simulation analyses indicate that the lean system excels in customer service performance while the leagile system results in lower enterprise-wide inventory levels under the modeled circumstances. Subsequent analysis suggests that trade-offs exist among the systems in the base case and under varying cost conditions.

The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Air Force, the Department of Defense, or the U.S. Government.

INTRODUCTION

An unprecedented number of companies are pursuing lean management and agility to reduce costs, improve customer service, and gain competitive advantage. "Lean thinking" embraces the elimination of waste in its various forms. Activities that consume resources but generate no redeeming value in the eyes of customers are wastes that must be eliminated in the "lean" paradigm (Womack and Jones 1996). Agility, on the other hand, emphasizes flexible, timely action in response to rapidly changing demand environments (Christopher and Towill 2002).

Lean management has been the subject of best-selling business books over the past decade and the focus of many management training programs as managers seek to make the "lean leap." Agile management, meanwhile, has enjoyed its own share of attention as the mantra espoused by many leading consultancies and technology vendors. Academics have embraced both paradigms as well, with special issues commonly appearing in leading journals dedicated to each philosophy. Beyond indications of passing interest, entire journals are dedicated to the advancement of theory and practice in leanness and agility (e.g., Lean Construction Journal, International Journal of Agile Management Systems).

Though the "lean" and "agile" philosophies are anchored in relatively simple premises, their complexity becomes apparent during implementation. The very requirements and performance outcomes associated with the two approaches are often called into question. It appears that neither paradigm is particularly well understood - even by companies considering their adoption and implementation. The ambiguity of the paradigms raises the challenge of determining whether one approach or the other would serve as an appropriate basis for a supply chain strategy. When should a company, and perhaps an entire supply chain, pursue lean management or agility, and must the question be an either/or proposition? Answering these questions is critical given the significance of aligning the company with its upstream and downstream supply chain trading partners in support of business strategy and key supply chain objectives.

The purpose of this paper is to further the understanding of lean, agile, and hybrid (or so-called "leagile") supply chain strategies, with particular interest directed toward the dynamics and trade-offs associated with each. This objective is achieved by operationalizing the three strategies in a real-world case setting. Simulation research is used to examine the operationalization of the different strategies and to measure the performance associated with each, identifying the similarities and differences among strategic inputs and outcomes. The application of simulation to supply chain settings is well established given the stochastic nature of supply chains, where decisions in one area have impact on others (Bhaskaran 1998; Closs et al. 1998; Disney, Naim, and Towill 1997; Towill 1996; Waller, Johnson, and Davis 1999). Simulation provides a basis for comparison among alternative strategies and, in turn, enhanced managerial decision-making.
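The comparisons the authors run are far richer than can be shown here, but the core idea of simulating alternative strategies against a common demand stream can be sketched with a toy periodic-review inventory model. Everything in this sketch (the order-up-to policy, the demand distribution, the stock targets loosely standing in for a high-responsiveness versus low-inventory posture) is an illustrative assumption, not the paper's model:

```python
import random

def simulate(order_up_to, demands):
    """Toy periodic-review, order-up-to policy: stock is restored to the
    target level each period (lead time treated as negligible)."""
    filled = total = inventory_sum = 0
    for d in demands:
        served = min(order_up_to, d)           # serve what stock allows
        filled += served
        total += d
        inventory_sum += order_up_to - served  # end-of-period stock on hand
    return filled / total, inventory_sum / len(demands)

random.seed(7)  # common demand stream shared by both strategies
demands = [random.randint(0, 20) for _ in range(1000)]

fill_hi, inv_hi = simulate(40, demands)  # high stock target
fill_lo, inv_lo = simulate(10, demands)  # low stock target
print(fill_hi, round(inv_hi, 1))         # perfect fill rate, high inventory
print(round(fill_lo, 2), round(inv_lo, 1))  # leaner, but demand goes unmet
```

Even this toy exhibits the service-versus-inventory trade-off the simulation analyses quantify: the high-stock run fills all demand at the cost of carrying far more inventory.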

This paper first describes the three supply chain strategies and introduces the research hypotheses. The paper then details the research setting and method, reviewing the operationalization of the strategies in the simulation models. Finally, the paper presents the results and implications for managers and researchers.

A REVIEW OF THE STRATEGIES

This section describes the three supply chain strategies of interest: lean, agile, and leagile. Each is described in turn.

Lean Supply Chains

Womack, Jones, and Roos (1990) introduced the business world to the premise of lean production in their seminal book The Machine That Changed the World. The book chronicled the operations found in the automotive industry, capturing the dramatic differences in approach and ensuing performance found among the world's leading automakers. In particular, the book examined how the techniques employed by Japanese automakers, namely Toyota, outpaced the performance achieved by U.S. and European competitors. Much has been written in the academic and popular business press about Toyota's much envied competitive weapon, the Toyota Production System (TPS).

Friday, July 13, 2007

Using Plates To Represent Fillets in Finite-Element Modeling

Structural deflections are approximated by use of simplified computational submodels of fillets.

A method that involves the use of fictitious plate elements, denoted bridge plates, has been developed for representing the stiffnesses of fillets in finite-element calculations of deflections, stresses, and strains in structures. In the absence of this method, it would be necessary either to neglect the effects of fillets to minimize the computational burden or else to incur a large computational burden by using complex computational models to represent the fillets accurately. In effect, the bridge plates of the present method are reduced-order models of fillets: they do not yield accurate stresses within the fillets, but they do make it possible to calculate the dynamic characteristics of the structure accurately and to approximate the effects of fillets on stresses and strains elsewhere in a structure that contains them. Such approximations are accurate enough for final modal analysis and preliminary stress analyses.

In a finite-element model according to this method, the model of a fillet includes bridge plates that connect the tangent lines of the fillet. For a given fillet, the bridge plates are characterized by a thickness (t_b) and a pseudo Young's modulus (E_b) chosen to represent the mass and stiffness of the fillet as accurately as possible. It is necessary to calculate t_b and E_b in advance, by means of the procedure described in the next paragraph.

One generates two simultaneous nonlinear wide-beam-deflection equations for the rotation at the tangent lines: one applicable to the bridge-plate representation and one derived from an analytic representation of the fillet. These equations are formulated in terms of the independent variables r/t and t_wall/t, where r is the fillet radius, t_wall is the thickness of the non-filleted section of a wall adjacent to the filleted section, and t is a thickness variable whose value one seeks. The equations are solved numerically to obtain t_b and E_b. In addition, surface fits of the solutions are obtained for use as the equivalent of closed-form equations for t_b and E_b.
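As a rough illustration of this step, the sketch below solves a generic two-equation nonlinear system with a damped Newton iteration and a finite-difference Jacobian. The residual functions are placeholders standing in for the wide-beam-deflection rotation equations, which are not reproduced in the article; the forms and values used here are purely illustrative.

```python
def solve_2x2(f, x0, tol=1e-10, max_iter=50):
    """Newton iteration for a 2-equation nonlinear system, using a
    forward-difference Jacobian (no external libraries required)."""
    x, y = x0
    h = 1e-7
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) < tol and abs(f2) < tol:
            return x, y
        # Finite-difference Jacobian entries
        j11 = (f(x + h, y)[0] - f1) / h
        j12 = (f(x, y + h)[0] - f1) / h
        j21 = (f(x + h, y)[1] - f2) / h
        j22 = (f(x, y + h)[1] - f2) / h
        det = j11 * j22 - j12 * j21
        # Newton update: d = J^{-1} (-f)
        x += (-f1 * j22 + f2 * j12) / det
        y += (j21 * f1 - j11 * f2) / det
    return x, y

# PLACEHOLDER residuals: "bridge-plate rotation minus analytic fillet
# rotation" stand-ins, parameterized by r/t and t_wall/t -- not the
# actual equations from the article.
r_over_t, twall_over_t = 2.0, 1.5

def residuals(t_b, e_b):
    g1 = e_b * t_b**3 - r_over_t    # stand-in for rotation match 1
    g2 = e_b * t_b - twall_over_t   # stand-in for rotation match 2
    return g1, g2

t_b, e_b = solve_2x2(residuals, (1.0, 1.0))
print(f"t_b = {t_b:.4f}, E_b = {e_b:.4f}")
```

In practice one would solve such a system over a grid of (r/t, t_wall/t) values and then fit a surface to the solutions, giving the closed-form-equivalent expressions for t_b and E_b mentioned above.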

The method has been verified in calculations pertaining to a representative filleted structure. The bridge-plate model yielded a level of accuracy for the calculation of natural frequencies and mode shapes better than or equal to that obtained by use of a high-fidelity solid model of the fillet, even though the bridge-plate model contained 90 percent fewer degrees of freedom.

Dynamical Modeling of the Relations Between Leisure Activities and Health Indicators

In an extension of maladaptive behavior determinism (MBD) theory, which states that ordered behavior patterns over time are suggestive of disease states, we examined the relation between leisure activity and health behavior over time. MBD is derived from complexity or chaos theory. It was hypothesized that, over time, increased activity levels would be related to more randomly occurring health behaviors. For 68 participants, daily self-monitoring of leisure activities and four health indicators (healthy eating, feeling hassled, positive mood, and drinking alcohol) was carried out for five weeks, and the data were modeled using multiple time-series methods. Results showed some support for the hypothesis, particularly with respect to the health indicator feeling hassled. The findings extend support for MBD and also suggest that physically very active leisure time might have health benefits that are dynamical and not necessarily immediately apparent.

The benefits of leisure activity and active living (Di Bona, 2000) are well known and are reflected in an increase in exercise participation and a positive response to public health promotion efforts (Frankish, Milligan, & Reid, 1998). The benefits of leisure activity include alleviation of anxiety (Kaufmann, 1988), increased well-being (Coleman & Iso-Ahola, 1993), identity development (Kleiber & Rickards, 1985), improved physical and mental health (Caldwell, Smith, & Weissinger, 1992; Winefield, Tiggemann, & Winefield, 1992), and stress-coping benefits (Shaw, Caldwell, & Kleiber, 1996). Prior research often does not distinguish type of leisure activity. However, physically active leisure activities (versus sedentary ones) have been shown to be preventive factors for cardiovascular and other major diseases (Folsom et al., 1997; Mensink, Deketh, Mul, Schuit, & Hofmeister, 1996; Schlicht, 2002; Wenger, 1996). Therefore, inspection of the effect of different types of leisure activities upon health appears warranted.

In addition to exploring the effect of type of leisure activity, an issue that has yet to be addressed is the dynamical relation between leisure activity and health (Frankish et al., 1998; Kleiber, Hutchinson, & Williams, 2002; Mahoney & Stattin, 2000; Mota & Esculas, 2002; Zeijl, te Poel, du Bois-Reymond, Ravesloot, & Meulman, 2000). Dynamical modeling represents how a system changes or "behaves" as time passes and requires repeated measures of variables, such as daily self-monitoring. This is in contrast to static modeling, which examines variables at only a single point in time. Prior research has relied almost exclusively on static modeling using retrospective self-reports of leisure activity. Given that some health indicators are unstable, such as mood (Hill & Hill, 1991; Thayer, 1996), it is important to inspect their relation to leisure activities over time.

For purposes of this study, we define health in the broadest sense to include not only absence of disease, but also overall well-being (e.g., happiness and quality of life). We define health indicators to include not only physical behaviors related to (or indicators of) health, but mood states as well. Our primary objective was to examine the dynamical influences (i.e., over time) of physical leisure activity on health indicators. This was possible by incorporating daily self-monitoring in a prospective manner for the assessment of all variables (leisure activity and health indicators). Additionally, an open-ended format was incorporated for assessing types of leisure activity, avoiding the problem of pre-determined (and possibly insufficient) categories.

This approach allowed for the testing of an important health theory that has been emerging in recent years: maladaptive behavior determinism (MBD), which is based on complexity or chaos theory. This theory originated through studies of behavior involving laboratory-based biological and psychophysiological assessment over time, such as electrocardiograms (Goldberger & West, 1987). Evidence has suggested that adaptive conditions exhibit a high degree of randomness, whereas maladaptive conditions exhibit more deterministic, and possibly chaotic, characteristics (Babloyantz & Destexhe, 1986; Ehlers, Havstad, Garfinkel, & Kupfer, 1991; Goldberger et al., 1987). However, in studies of behavior requiring self-report outside of the laboratory, comparable evidence has been developing only very gradually (Barton, 1994; Melancon, Joanette, & Belair, 2000).

Nevertheless, the above-cited findings provide support for MBD, which states that adaptive behavior is characterized by a dominant random component because it adjusts to exogenous factors in the environment that occur independently of time (Skarda & Freeman, 1987). Maladaptive behavior is characterized by greater determinism because it responds relatively more to endogenous factors, which do occur as a function of time and perseverate even in the presence of environmental change. So, according to MBD, when an individual is healthy, measures of behavior (indicators) are characterized by unstructured (i.e., random) fluctuations over time. When the individual is unhealthy, these same measures exhibit structured (i.e., cyclic or periodic) patterns over time (Goldberger et al., 1987; Heiby, Pagano, Blaine, Nelson, & Heath, 2003; Skarda et al., 1987). A cycle is simply a pattern of change that repeats itself over time. For example, the rise and fall of the tides follows a cyclic pattern.
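The MBD premise can be illustrated with a toy calculation. The sketch below uses lag-1 autocorrelation as a crude stand-in for the multiple time-series methods used in the study: a cyclic series of daily observations shows strong serial structure, while a random series does not. The weekly cycle and noise parameters are illustrative assumptions, not data from the study.

```python
import math
import random

def lag1_autocorr(xs):
    """Sample lag-1 autocorrelation of a sequence."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

random.seed(0)
days = 35  # five weeks of daily self-monitoring, as in the study

# "Maladaptive" pattern: a structured weekly cycle.
cyclic = [math.sin(2 * math.pi * t / 7) for t in range(days)]
# "Adaptive" pattern: randomly fluctuating behavior.
noisy = [random.gauss(0, 1) for _ in range(days)]

print(f"cyclic series lag-1 autocorrelation: {lag1_autocorr(cyclic):.2f}")
print(f"random series lag-1 autocorrelation: {lag1_autocorr(noisy):.2f}")
```

By this simple measure the cyclic series is markedly more "deterministic" than the random one, which is the kind of distinction MBD draws between unhealthy and healthy behavior patterns.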

Tuesday, July 10, 2007

MODELING APPROACHES IN AVIAN CONSERVATION AND THE ROLE OF FIELD BIOLOGISTS

This review grew out of our realization that models play an increasingly important role in conservation but are rarely used in the research of most avian biologists. Modelers are creating models that are more complex and mechanistic and that can incorporate more of the knowledge acquired by field biologists. Such models require field biologists to provide more specific information, larger sample sizes, and sometimes new kinds of data, such as habitat-specific demography and dispersal information. Field biologists need to support model development by testing key model assumptions and validating models. The best conservation decisions will occur where cooperative interaction enables field biologists, modelers, statisticians, and managers to contribute effectively.

We begin by discussing the general form of ecological models-heuristic or mechanistic, "scientific" or statistical-and then highlight the structure, strengths, weaknesses, and applications of six types of models commonly used in avian conservation: (1) deterministic single-population matrix models, (2) stochastic population viability analysis (PVA) models for single populations, (3) metapopulation models, (4) spatially explicit models, (5) genetic models, and (6) species distribution models. We end by considering the intelligent use of models in decision-making, which requires understanding their unique attributes, determining whether the assumptions that underlie the structure are valid, and testing the ability of the model to predict the future correctly.

INTRODUCTION

AVIAN BIOLOGISTS INVOLVED in conservation activities encounter formal mathematical and simulation models ever more frequently and in ever more diverse forms. Models are constructed to act as descriptions of ecological systems (Maynard Smith 1974). As in other sciences, such models have driven the development of certain concepts in conservation biology, such as population viability and metapopulation dynamics. Mathematical and simulation models (hereafter "models") have been used to predict outcomes based on past, current, or projected conditions, and they serve a useful role in synthesizing knowledge and guiding research. To make a model, one is forced to state explicitly the relations between external factors and the state of the system, and this quickly reveals the limits of our understanding. More significantly, models have become important tools that are applied to policy decisions, and their use will continue to expand as desktop computing power grows and user-friendly software makes modeling increasingly accessible. Models, however, are neither a panacea nor the only useful kind of analysis for making conservation decisions. Intelligent use of models in decision-making requires understanding their unique attributes, determining whether the assumptions that underlie the structure are valid, and testing the ability of the model to predict the future correctly.

The present review, and a symposium at an American Ornithologists' Union (AOU) meeting sponsored by the AOU Conservation Committee, grew out of our realization that models play an important role in conservation but are rarely incorporated in the research of most avian biologists. For example, at a recent AOU meeting, only ~4% of 317 papers presented or tested models, compared with ~21% at a meeting of the Ecological Society of America held a few days earlier. Nonetheless, most presenters at both meetings employed a statistical model to test the significance of, or evaluate patterns in, their data. Talking about models with ornithologists evokes strong reactions, as evidenced by the responses of AOU meeting attendees to the question: "What is the first thing that you think of when I say the words 'model' or 'ecological model'?" Answers included "hot air," "money for someone else," "predicting the future," "I go right to the Discussion and hope that they know what they are doing," "people who haven't been in the field enough," "computers," "assumptions and generalizations," "reality?," and "something I don't understand at all."

In pixels and in health: computer modeling pushes the threshold of medical research

Moment by moment, a movie captures the action as a group of immune cells scrambles to counter an invasion of tuberculosis bacteria. Rushing to the site of infected lung tissue, the cells build a complex sphere of active immune cells, dead immune cells, lung tissue, and trapped bacteria. Remarkably, no lung tissue or bacterium was harmed in the making of this film.

Instead, each immune cell is a computer simulation, programmed to fight virtual tuberculosis bacteria on a square of simulated lung tissue. In their computer-generated environment, these warrior cells spontaneously build a structure similar to the granulomas that medical researchers have noted in human lungs fighting tuberculosis.

The simulation, created by Denise Kirschner of the University of Michigan in Ann Arbor, is an example of an emerging technique called agent-based modeling. This new tool in the world of medical research relies on computing power instead of tissues and test tubes. A growing cadre of researchers, including Kirschner, predicts that agent-based modeling will usher in a broadened understanding of complex interactions within the human body.

The agents in the models are individual players--immune cells in the tuberculosis example. Each player is programmed with rules that govern its behavior. Computer-savvy researchers then set the agents free to cooperate with, compete with, or kill each other. Meanwhile, the agents must navigate the surrounding environment, whose properties can vary over space and time.

Scientists can manipulate disease progression within the models by changing the agents or their environment and then watching what happens. As opposed to traditional, biologically based in vivo or in vitro experiments, these computer trials are dubbed "in silico." The results can suggest biological experiments to test the models' findings and may eventually lead to new medical treatments.

Even simple rules assigned to agents can give rise to surprisingly complex behaviors. When many independent agents interact, they create phenomena--such as the granulomas--that can't necessarily be predicted by breaking down the system into its separate components, says complex-systems specialist John Holland of the University of Michigan.

"You've got to study the interactions as well as the parts," Holland says.

In-silico modeling differs from traditional mathematical modeling, which uses differential equations to understand how molecules or cells behave in an averaged, continuous way. Instead, the agents of in-silico modeling make independent decisions in response to situations that they encounter. As a result, unusual activity of even a small number of cells can change the entire system's behavior.
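The agent-based idea described above can be sketched in a few lines. The toy model below is not Kirschner's simulation: immune-cell agents simply random-walk on a small grid of simulated tissue and clear any bacterium they land on. Every rule and parameter here is an illustrative assumption; the point is only that each agent follows simple local rules while the population-level outcome emerges from their interactions.

```python
import random

random.seed(1)
SIZE = 10  # side length of the square tissue grid

# Scatter bacteria and immune-cell agents at random grid sites.
bacteria = {(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(15)}
immune_cells = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(5)]

def step(cells, bugs):
    """Advance one tick: each agent moves to a random neighboring site
    (wrapping at the edges), then any bacterium sharing a site with an
    immune cell is cleared."""
    moved = []
    for x, y in cells:
        dx, dy = random.choice([(0, 1), (0, -1), (1, 0), (-1, 0)])
        moved.append(((x + dx) % SIZE, (y + dy) % SIZE))
    bugs -= set(moved)  # contact kills the bacterium
    return moved, bugs

initial = len(bacteria)
for _ in range(200):
    immune_cells, bacteria = step(immune_cells, bacteria)

print(f"bacteria cleared: {initial - len(bacteria)} of {initial}")
```

Richer models replace the random walk with rules conditioned on the local environment (chemokine gradients, cell state, neighboring agents), which is how structures like the granulomas above can emerge without being programmed in explicitly.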

Computers can now calculate thousands of interactions with ease, says Alan Perelson of Los Alamos National Laboratory in New Mexico. "Agent-based modeling has only come into its own with the arrival of really powerful computers sitting on people's desktops, within the last 10 or 15 years," he notes.

Pioneered for economics and population-dynamics studies (SN: 11/23/96, p. 332; www.sciencenews.org/pages/sn_arc99/4_10_99/mathland.htm), agent-based modeling has only recently plumbed the inner workings of the human body, Perelson adds. That's partly because new imaging and genetic techniques are providing crucial data on which agents' rules can be based.

"Agent-based modeling represents a new frontier with respect to how we do science," says surgeon Gary An of Cook County Hospital in Chicago. "In medicine in particular, all the diseases that we're now dealing with are complex problems: sepsis, cancer, AIDS. All these things are disorders of the system as a whole."

INFLAMMATION SIMULATION An, whom Kirschner calls an in-silico "groundbreaker," got into agent-based modeling to help people survive traumatic injuries and major infections.

A leading cause of death for patients in intensive care units, An explains, is a syndrome called systemic inflammatory response syndrome/multiple organ failure (SIRS/MOF), also termed sepsis when it occurs in response to an infection. In this syndrome, the body's inflammatory response rages out of control after a severe injury or bacterial infection. Excessive inflammation can kill a patient by attacking and shutting down vital organs. More commonly, the runaway inflammation paralyzes the rest of the immune response, and the patient then dies of secondary infections.

During the 1990s, researchers performed clinical experiments in an attempt to develop drugs that dampen an overwhelming inflammatory response to injury, An notes. Only one drug, activated protein C, appeared to help patients with SIRS/MOF. An suggests that trials of other drugs failed because they were planned using data representing individual components of the inflammatory response rather than the interactions of the immune system as a whole.

An says, "It's kind of a Humpty Dumpty syndrome, where after you break the system apart, you can't put it back together."

Monday, July 09, 2007

Simulation, Modeling, and Crystal Growth of Cd0.9Zn0.1Te for Nuclear Spectrometers

High-quality, large (10 cm long and 2.5 cm diameter), nuclear spectrometer grade Cd0.9Zn0.1Te (CZT) single crystals have been grown by a controlled vertical Bridgman technique using in-house zone-refined precursor materials (Cd, Zn, and Te). A state-of-the-art computer model, the multizone adaptive scheme for transport and phase-change processes (MASTRAP), is used to model heat and mass transfer in the Bridgman growth system, to predict the stress distribution in the as-grown CZT crystal, and to optimize the thermal profile. The model accounts for heat transfer in the multiphase system, convection in the melt, and interface dynamics. The grown semi-insulating (SI) CZT crystals have demonstrated promising results for high-resolution room-temperature radiation detectors due to their high dark resistivity (ρ ≈ 2.8 × 10¹¹ ohm cm), good charge-transport properties [electron and hole mobility-lifetime products µτ_e ≈ (2-5) × 10⁻³ and µτ_h ≈ (3-5) × 10⁻⁵, respectively], and low cost of production. Spectroscopic ellipsometry and optical transmission measurements were carried out on the grown CZT crystals using two-modulator generalized ellipsometry (2-MGE). The refractive index n and extinction coefficient k were determined by mathematically eliminating the ~3-nm surface-roughness layer. Nuclear detection measurements on the single-element CZT detectors with ²⁴¹Am and ¹³⁷Cs clearly detected the 59.6 and 662 keV energies with energy resolutions (FWHM) of 2.4 keV (4.0%) and 9.2 keV (1.4%), respectively.

Cadmium zinc telluride (CZT) has emerged as one of the most attractive and promising materials for room-temperature γ- and x-ray spectroscopy. CZT has the advantages of a high average atomic number (Z ≈ 50), high density (5.8 g/cm³), and a wide bandgap (1.50 eV at 300 K), yielding CZT detectors that are highly efficient at room temperature and above.¹ Currently used Si and Ge detectors can only work efficiently at liquid-nitrogen temperature, which is expensive and inconvenient. The energy required to generate one electron-hole pair in CZT (~5 eV) is much less than that required for scintillation crystals coupled to photomultiplier tubes (~50 eV), resulting in better energy resolution. CZT materials have also shown improved spectral performance using novel single-carrier detector designs, such as the Frisch ring,²,³ the small-pixel effect,⁴ and the coplanar grid.⁵ Due to these advantages, CZT has been the material of choice for x- and γ-ray detectors in medical imaging, infrared focal-plane arrays, national security, environmental monitoring, and space astronomy.⁶⁻⁹

Although tremendous efforts have been made to grow large, high-quality CZT crystals, the production of CZT, dominated by the high-pressure Bridgman method, suffers from low yields and small device sizes. Current CZT growth technology continues to endure problems, including easy defect formation (such as grains and twinning), precipitation and inclusion of Cd and Te, cracking due to thermal stresses, and nonuniform crystal composition caused by zinc segregation. During growth, the quality of the as-grown crystal is significantly influenced by complex transport phenomena taking place in the furnace. The melt flow driven by buoyancy forces has been shown to significantly affect the solid/melt interface shape and the dopant and impurity distribution in the as-grown crystal, giving rise to radial and axial segregation that adversely affects device quality.¹⁰ In addition, the inhomogeneous temperature distribution, as well as wall contact, can cause mechanical stresses in the crystal and result in a high dislocation density. Achieving dopant uniformity in the grown CZT crystal requires precise control of the melt flow and of heat and mass transfer in the growth system.

An alternative modified vertical Bridgman growth technique for CZT crystals has been adopted at EIC to produce large-volume detector-grade single crystals in high yield. The growth process has been studied numerically using an integrated model that combines formulation of global heat transfer and thermal elastic stresses. Using the elastic stress submodel, thermal stresses in the growing crystal caused by the nonuniform temperature distribution can be predicted. Special attention is directed to the interaction between the crystal and the ampoule. The global temperature distribution in the furnace, the flow patterns in the melt, and the interface shapes are presented herein.

The CZT crystal grown from zone-refined (ZR) precursor materials using the growth furnace at EIC showed very good charge-transport properties, i.e., a high mobility-lifetime product for electrons while maintaining high bulk resistivity. Spectroscopic ellipsometry and optical transmission measurements of the grown CZT crystal using two-modulator generalized ellipsometry (2-MGE) are presented. Detection performances of CZT detectors with ²⁴¹Am and ¹³⁷Cs are also reported.

Computational Modeling of Extracellular Mechanotransduction

Mechanotransduction may occur through numerous mechanisms, including potentially through autocrine signaling in a dynamically changing extracellular space. We developed a computational model to analyze how alterations in the geometry of an epithelial lateral intercellular space (LIS) affect the concentrations of constitutively shed ligands inside and below the LIS. The model employs the finite element method to solve for the concentration of ligands based on the governing ligand diffusion-convection equations inside and outside of the LIS, and assumes idealized parallel plate geometry and an impermeable tight junction at the apical surface. Using the model, we examined the temporal relationship between geometric changes and ligand concentration, and the dependence of this relationship on system characteristics such as ligand diffusivity, shedding rate, and rate of deformation. Our results reveal how the kinetics of mechanical deformation can be translated into varying rates of ligand accumulation, a potentially important mechanism for cellular discrimination of varying rate-mechanical processes. Furthermore, our results demonstrate that rapid changes in LIS geometry can transiently increase ligand concentrations in underlying media or tissues, suggesting a mechanism for communication of mechanical state between epithelial and subepithelial cells. These results underscore both the plausibility and complexity of the proposed extracellular mechanotransduction mechanism.

Cells often communicate through the exchange of extracellular autocrine and/or paracrine signals. Changes in the local levels of these molecules, derived from alterations in production, metabolism or transport, are dynamically sensed, allowing cells to respond appropriately to their microenvironment (1). We have proposed a mode of mechanotransduction whereby cells respond to changes in the local extracellular concentration of autocrine ligands that are caused solely by deformation of the extracellular space (2). In support of this hypothesis, we demonstrated that the extracellular space in cultured human bronchial epithelial cells deforms under transcellular compressive stress, and that an autocrine ligand-receptor signaling loop is activated by the same mechanical stimulus (2). The essential components of autocrine ligand-receptor circuits are frequently found to be constitutively expressed and colocalized in the basolateral compartment of epithelial cells (3). In our previous work, a simple analytical relationship was derived to predict the steady-state ligand concentration in the local extracellular space before and after mechanical loading (2). While this steady-state analysis was essential in establishing the plausibility of the extracellular mechanotransduction mechanism, it could not address the kinetics of the process, and omitted potentially important effects of convection.

Here we develop a generalized finite-element solution of the one-dimensional diffusion-convection equation to evaluate the temporal changes in ligand concentration occurring in a dynamically collapsing interstitial space between epithelial cells. We introduce a new geometry for the model that accommodates the diffusion and convection of ligands shed into the lateral intercellular space, which is continuous with an underlying media reservoir. Employing the model, we explore the parameter space of the governing equations, examining the effect of ligand diffusivity, shedding rate, and rate of extracellular space change on the kinetics of ligand accumulation. The new model geometry reveals the transient effect of convection on ligand concentration changes in the underlying space (e.g., media for the in vitro case or tissues in vivo), suggesting a potential mechanism for communication of a change in the mechanical state of the epithelium to underlying tissues. Moreover, the model offers a novel explanation for how cells could discriminate between mechanical processes occurring over a range of rates in different physiological scenarios. We use insights gained from the model to propose two explanations for a selective contribution of the EGF-family ligand heparin-binding EGF (HB-EGF) to the transduction of mechanical stress via autocrine signaling in a collapsing extracellular space.

The transitional regime extends to a distance R_t = w/π from the LIS boundary. This distance was determined by matching the fluxes through the Cartesian (w) and radial (πR_t) lengths through which the flux passes. We further approximate the velocity field in this domain as uniform, equal to the bulk velocity at the LIS exit, V_t = V_x(x = h). The transitional regime was included to avoid numerical difficulties that can occur when switching coordinate systems. The approximations made in this domain have little impact on the overall concentration profile inside and outside of the LIS (data not shown).

The radial domain encompasses the region between R_t (the end of the transitional domain) and R_0 = h/2 (where we assume the ligand concentration to be zero). Mathematically, the zero-concentration boundary would be infinitely far away from the LIS (i.e., R_0 → ∞), but for efficient numerical simulations we determined that, for a LIS height h = 15 µm (2), R_0 = 7.5 µm is sufficiently far from the LIS boundary that further increasing R_0 had little effect on the overall concentration profile (data not shown). Hence, for all of the simulations we fixed R_0 = 7.5 µm, half of the previously measured LIS height h = 15 µm (2).
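The transport problem above can be illustrated with a deliberately simplified sketch: a one-dimensional diffusion calculation with a uniform shedding source along a channel of height h, closed at the tight junction and open (zero concentration) at the LIS exit. It uses an explicit finite-difference scheme rather than the paper's finite-element formulation, omits convection, and all numerical values (diffusivity, shedding rate, grid size) are placeholders, not the study's parameters.

```python
h = 15e-6   # channel height, m (order of the 15 µm LIS height cited above)
n = 30      # grid points
dx = h / (n - 1)
D = 1e-10   # ligand diffusivity, m^2/s (placeholder)
q = 1e-3    # uniform shedding source term (placeholder units)
dt = 0.4 * dx * dx / D  # explicit-scheme stability requires dt <= dx^2 / 2D

c = [0.0] * n  # ligand concentration along the channel
for _ in range(2000):
    new = c[:]
    for i in range(1, n - 1):
        # diffusion + uniform shedding at interior nodes
        new[i] = c[i] + dt * (D * (c[i-1] - 2*c[i] + c[i+1]) / dx**2 + q)
    new[0] = new[1]   # no-flux wall at the tight junction
    new[-1] = 0.0     # ligand washed out at the open LIS exit
    c = new

print(f"peak concentration (closed end): {c[0]:.3e}")
```

Even this crude version reproduces the qualitative picture: shed ligand accumulates toward the closed (apical) end and drains at the open end, and shrinking h or raising q raises the local concentration the cells would sense.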

Friday, July 06, 2007

Modeling of Optical Response in Graded Absorber Layer Detectors

Commercial and military applications for IR detectors continually push the performance envelope to achieve the highest signal-to-noise ratio at the longest possible wavelength. The highest performance is generally achieved when the detector cutoff wavelength is made as short as possible to minimize dark current while maintaining high uniform responsivity over the system optical passband. This requires an accurate prediction of the detector spectral shape near cutoff. As the incident wavelength approaches cutoff, the optical absorption coefficient decreases and internal reflections within the layer become important, causing constructive or destructive interference and consequently modulation of the detector spectral shape. The analysis is complicated since the absorber layer often includes compositional grading, particularly for very long-wavelength IR devices. This article describes an approach for analyzing the spectral response of backside-illuminated, compositionally graded detectors, and the results are compared with experimental data. The analysis shows that the spectral shape near cutoff is impacted by a weak resonance within the absorber created by a combination of a strong reflection from the detector front side and a weak reflection from the absorber-substrate backside interface. The analysis, which is an extension of the work of Rosenfeld et al., takes into account the weak resonance through computation of the optical field and generation rate by the Wentzel-Kramers-Brillouin (WKB) method. The method has been compared with experimental data for several cases using both parabolic and hyperbolic models for the absorption coefficient, and excellent agreement is achieved with the hyperbolic model.

POST-HARVEST RIPARIAN BUFFER RESPONSE: IMPLICATIONS FOR WOOD RECRUITMENT MODELING AND BUFFER DESIGN

Despite the importance of riparian buffers in providing aquatic functions to forested streams, few studies have sought to capture key differences in ecological and geomorphic processes between buffered sites and forested conditions. This study examines post-harvest buffer conditions at 20 randomly selected harvest sites within a managed tree farm in the Cascade Mountains of western Washington. Post-harvest wind-derived treefall rates in buffers up to three years post-harvest averaged 268 trees/km/year, 26 times greater than competition-induced mortality-rate estimates. Treefall rates and stem breakage were strongly tied to tree species and relatively unaffected by stream direction. Observed treefall direction is strongly biased toward the channel, irrespective of channel or buffer orientation. Fall-direction bias can deliver significantly more wood recruitment relative to randomly directed treefall, suggesting that models that utilize the random-fall assumption will significantly underpredict recruitment. A simple estimate of post-harvest wood recruitment from buffers can be obtained from species-specific treefall and breakage rates, combined with bias-corrected recruitment probability as a function of source distance from the channel. Post-harvest wind effects may reduce the standing density of trees enough to significantly reduce or eliminate competition mortality and thus indirectly alter bank erosion rates, resulting in substantially different wood recruitment dynamics from buffers as compared to unmanaged forests.
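The effect of fall-direction bias on recruitment can be illustrated with a small Monte Carlo sketch. A tree standing d metres from the channel recruits wood if its fall carries the stem over the bank, i.e. if H·cos(θ) ≥ d, where θ is the fall direction measured from the channel-ward direction. The tree height, source distance, and the spread of the channel-ward bias below are illustrative assumptions, not values from the study.

```python
import math
import random

random.seed(42)
H = 30.0   # tree height, m (placeholder)
d = 10.0   # distance from the channel, m (placeholder)
TRIALS = 100_000

def recruited(theta):
    """True if a fall in direction theta reaches the channel."""
    return math.cos(theta) * H >= d

# Random-fall assumption: direction uniform over the full circle.
random_hits = sum(
    recruited(random.uniform(-math.pi, math.pi)) for _ in range(TRIALS)
)

# Channel-biased fall: directions drawn from a normal distribution
# centred on the channel (the spread is a placeholder assumption).
biased_hits = sum(
    recruited(random.gauss(0.0, math.pi / 4)) for _ in range(TRIALS)
)

p_random = random_hits / TRIALS
p_biased = biased_hits / TRIALS
print(f"P(recruit), random fall: {p_random:.3f}")
print(f"P(recruit), biased fall: {p_biased:.3f}")
```

Under these assumptions the channel-biased recruitment probability is roughly double the random-fall value, which is the sense in which random-fall models underpredict recruitment.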

Thursday, July 05, 2007

CAD Software facilitates Total Modeling technique

Along with a preview mode and the ability to use morphing technology to add draft to models, PowerSHAPE v6 offers toolbars for analysis and model repair functionality. Its Total Modeling technique enables users to add logos, textures, and other decoration to CAD models, and triangle models can be incorporated into surface or solid models. In addition to a built-in Wizard, the software provides model compare functionality and the ability to produce a 360° wrap around a cone or cylinder.

********************

The latest version of Delcam's PowerSHAPE CAD system has made its unique Total Modelling functionality much easier to use. Other enhancements within PowerSHAPE 6 include new toolbars for the software's analysis and model repair functionality, and the ability to use the morphing technology to add draft to models.

Total Modelling is Delcam's unique method for adding logos, textures and other decoration to CAD models. It enables triangle models, including those generated within the ArtCAM engraving program and the CopyCAD reverse engineering system, to be incorporated into surface or solid models created in PowerSHAPE.

The latest release includes a new Wizard to both speed and simplify the process. This allows the user to determine the exact position and orientation of the wrapped decoration, and then adjust its size and aspect ratio to give the required effect. A preview mode illustrates the final effect before the full calculation is undertaken.

In another enhancement, the software is now able to produce a complete 360° wrap around a cone or cylinder. This will be especially valuable for the decoration of the sides or shoulders of bottles, one of the main applications for Total Modelling.

The new toolbars cover all PowerSHAPE's options for the analysis and repair of imported models. The analysis tools include smoothness shading, curvature analysis and draft angle shading, plus a new wall thickness shading command, together with model compare functionality to detect modifications between different versions of the same part. The model repair functionality is equally comprehensive, including options to remove duplicate surfaces, re-trim surface edges and stitch small gaps. Grouping these two sets of commands together will make it easier to find the required command and give a more logical workflow when moving from component design to tooling design.

The ability to use morphing to add draft solves one of the most common problems in preparing designs for manufacture - the lack of sufficient draft to remove the part from its tooling easily. While this might seem a trivial change, if it is not captured in the model's history tree, it can require the design to be recreated completely.

With the new option in PowerSHAPE 6, morphing can be used to apply draft to multiple surfaces within a surface or solid model simultaneously. In addition, where required, draft can be applied to the external surfaces of the part without affecting the internal surfaces.