

Friday, September 21, 2007

Remodeling a House – Change, Improve, Add New

Remodeling a house can take a lot of money, plenty of preparation, heaps of effort, and a great deal of smarts. You may ask: I want to start remodeling my house, but where do I begin? How? What can I do? The answer is: a lot, depending on how you balance your needs and your wants.

Remodeling a house could mean something as simple as changing your home's design theme, adjusting its color palette, or installing updated shelving. It could involve changing a room's entire function, expanding your floor space, or making use of unused space. Remodeling could even include adding an entirely new area: a swimming pool, spa, fitness room, porch, breakfast nook, game room, bar, den, walkway, terrace, garage, garden, or home office. Or it could be limited to simple maintenance, such as repainting the roof, replacing old pipes, or freshening up the floors. Best of all, and most common, home remodeling improvements can consist of adding the necessities you never had before: a waterproof basement, a bathtub, more baths, a dressing room, nursery, guest room, more rooms, secured doors and windows, needed storage, fire exits, an electric floor-warming system, kitchen ventilation, even a fireplace. As often as the seasons change, your home needs changes of its own to accommodate more needs, more people, more activities, more comforts, and to keep up with the times.

Remodeling a house can put you in the best whistle-while-you-work mood if you start with your front entry. Yes, the front entry: one of the few simple changes requiring little or no effort yet making a great impact on your home. After all, this is the space that greets family and friends when they come to visit; they might even think they have arrived at a completely different property once they step out of their cars. You can add a charming antique-patina mailbox at the curb to start the way into the house, or put up an ivy-woven lattice. A fresh coat of paint can easily do wonders for your front door, along with new hardware, a chain latch, a traditional doorbell, or a knocker. Then add stunning matching outdoor sconces or lanterns, corner displays of potted harvest flowers or herbs, and a cozy outdoor sitting area. Make visitors feel that there is more to see inside, and remodel the other areas that need an instant boost. Why not start consulting with construction firms, architects, and designers? Some home construction teams can even supervise the work on your home and coordinate with several other key people to add more value and a homier feel to your house.

Go ahead, if you have the power then do it – turn your average home into that charming dream house you have always wanted!

Construction Loan for the Cost of Home Remodeling

You may be asking how much home remodeling cost is too much compared with the amount you would gain if you sold. How do you stay within budget once you get excited about spending what little money you have on adding value to your home? You might realize that the daunting cost of home remodeling has put you in a spot where you need to turn to a more challenging option: a construction loan. The project may simply end up costing more than you hoped, and more than what is in your pocket.

The first question is whether you qualify for a construction loan to take some of the weight off your home remodeling costs. Construction mortgage companies require that you own your lot; this ensures that the bank has collateral to recover its investment if you fail to keep up the mortgage payments. If you do not, you may need to pay a premium. You can then proceed to a one-year loan plan to cover your remodeling costs. The good news is that you are not obligated for the full construction loan amount: you pay interest only on the amount you actually borrow at each draw (each of which carries a service charge), so you will not carry the full mortgage until the end of construction. Still, you need some ready cash to pay your contractors and keep them working, knowing full well they will not be fully paid until the work is finished. Above all, you need to learn how to budget.
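As a rough illustration of how paying interest only on each draw works, here is a minimal sketch with hypothetical draw amounts, an assumed 8% annual rate, and an assumed flat service charge per draw; actual loan terms will differ.

```python
# Hypothetical sketch: interest accrues only on the cumulative amount drawn,
# not on the full approved loan. Rates, draw amounts, and the service charge
# are assumptions for illustration only.

annual_rate = 0.08                      # assumed annual interest rate
monthly_rate = annual_rate / 12
service_charge_per_draw = 150           # assumed flat fee per draw

draws = [(1, 20_000), (3, 35_000), (6, 25_000)]   # (month, amount drawn)

outstanding = 0.0
interest_paid = 0.0
for month in range(1, 13):              # a one-year construction loan
    outstanding += sum(amount for m, amount in draws if m == month)
    interest_paid += outstanding * monthly_rate

fees = service_charge_per_draw * len(draws)
print(f"Total drawn: ${outstanding:,.0f}")
print(f"Interest during construction: ${interest_paid:,.2f}")
print(f"Draw service charges: ${fees:,.0f}")
```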

Be sure to account for all expenses to be drawn, from the contractor, plumbers, electricians, masons, excavator, landscaper, and designer to the material costs for drainage, windows and roof, even paint. Put these quotes in order. Note also that the mortgage company will not approve your loan unless you include a cost-overrun buffer. You will then need to supply permits, a survey, and a copy of your floor plan so the lender's appraiser can inspect your property and determine whether your project will appraise for the amount of the loan.

These are the immediate costs of home remodeling financed through a construction loan. Remember that you also have the option of returning any money you do not use. Just follow the rule "estimate high but spend less." It will let you sleep at night, and in the end you might just be pleased to keep the extra.

Balancing Style and Worth with Your Home Remodeling Costs

The classic way to increase the value of your house is to spend on remodeling your existing rooms or adding to the floor plan. And since you are spending anyway, why not make the most of what you can get?

You can choose to add new appliances, cabinets, or fixtures, or go much further if your remodeling budget allows. What matters is reaching the look or use you most want for your home. You can start by choosing the right space to remodel.

Home remodeling costs tend to be most worthwhile when spent on the rooms that can carry a completely different architectural style: the kitchen and the bathroom. That way you protect the character of the rest of your home by keeping it within the existing framework, while adding fresh attraction to the kitchen or bath.

And you do not really need to buy the most expensive materials for your new additions. You can keep your remodeling costs within the budget of simple repairs that are made to last: a practical, clean style, so to speak. But of course you can always go for a few simple luxuries. You can enlarge your bathroom or add a sunken whirlpool tub, a modern shower, or an adjoining spa or dressing room. The same goes for the kitchen: you can simply replace broken tiles, or put in a charming breakfast nook, a new kitchen island, or elegant granite countertops for more drama.

In contrast, the areas where additional home remodeling costs are least financially rewarding are spaces such as the basement, garage, yard, or walkway (unless you are turning them into a completely new living space). Adding a swimming pool should also be reconsidered carefully, since its cost is high and the ongoing maintenance outlay must be included as well. It may all be worth it in the end, but right now, consider the cost.

You can easily remodel just about any part of your house without adding too much to your overall remodeling costs. Replacing worn carpeting, tiles, and wood floors will give you the immediate improvement you are looking for. You can also simply update the paint colors or add new wall coverings. Bear in mind, too, that many stylish and trendy items can be found at discount prices at garage sales and outlet stores; you may even be fortunate enough to find a superb piece of cabinetry that fits your kitchen's theme perfectly. You can also recycle what you already own, taking it out of its usual context and giving it a completely new function. Just keep in mind that the true effectiveness of a space comes from the harmonious balance of all of its components. You may not always keep things simple, but stay consistent. Then even the smallest and cheapest changes in your home can make a world of difference.

Friday, September 14, 2007

Occupancy Estimation and Modeling

Occupancy Estimation and Modeling. Darryl I. MacKenzie, James D. Nichols, J. Andrew Royle, Kenneth H. Pollock, Larissa L. Bailey, and James E. Hines. 2006. Elsevier-Academic, San Diego, California, xviii + 324 pp., 23 figures, appendix. ISBN 0-12-088766-5. Hardback, $64.95. Species presence and absence (i.e., occupancy) data are increasingly being used by avian biologists to assess the status, distribution, and dynamics of bird populations (e.g., Olson et al. 2005, Tornberg et al. 2005, Karanth et al. 2006) and for developing conservation strategies (e.g., Freemark et al. 2006, Jiguet and Julliard 2006). Unfortunately, complete detection of a species is usually impossible, and the ability to detect a species is often related to species-specific traits and the physical characteristics of sample units (reviewed in Thompson 2002). Consequently, incomplete detection can bias occupancy estimates and impede the ability to make sound conservation decisions. Several methods have recently been developed to incorporate incomplete detection in occupancy models (MacKenzie et al. 2002, 2003, 2004; Dorazio and Royle 2005; MacKenzie and Royle 2005). These pioneering efforts, however, have been presented as separate works and often in a manner that was difficult for all but the most technically savvy to understand. This book is an attempt to synthesize existing ideas on occupancy estimation and modeling in a form that is understandable to biologists and ecologists without strong statistical backgrounds.

The book is a well-organized and comprehensive treatment of occupancy estimation that entails multiple aspects including sample design, analysis, and interpretation. The first three chapters cover basic ecological and statistical background and introduce terminology and concepts that are used throughout the book. Chapter 1 is a philosophical treatment of the nature of science and management and the role of field surveys and monitoring. This philosophy is reflected in much of the material presented throughout the book. We believe that such context is important and often lacking from statistics-oriented texts. Chapter 2 provides an overview of the ecological aspects of occupancy and includes a description of metapopulation dynamics. Although the chapter is not intended to be a thorough review, the authors have done a commendable job compiling and synthesizing an abundance of information on metapopulation dynamics as it relates to occupancy estimation. Chapter 3 is an excellent and thorough review of the basic principles of statistical estimation and inference that should prove useful for professionals and for graduate-level instruction. The chapter thoroughly details all aspects of parametric statistics: maximum likelihood estimation, hypothesis testing, goodness-of-fit, and model selection. However, none of these topics is covered in relation to Bayesian methods. As such, readers will have no basis for evaluating the goodness-of-fit, convergence, and selection of Bayesian models. Yet later chapters include computer code for fitting Bayesian occupancy models. We believe that this is a potentially hazardous combination and hope that the authors can remedy the problem in future editions.

Chapters 4-7 cover single-species occupancy estimation and gradually build from relatively simple, constant detection-probability estimation (chapter 4) to more complex, multiple-season models (chapter 7). Each chapter begins with a useful general introduction and explanation of purpose. Models are then derived in logical sequences, with sufficient mathematical details and clear, concise explanations that should satisfy and enlighten biologists, whatever their level of statistical proficiency. Each chapter contains at least two examples that are used to illustrate model fitting, parameter estimation, and the presentation and interpretation of results. The material in this section of the book is very thorough and is generally presented in a logical sequence. Chapter 6, which covers the design of single-season occupancy studies, includes a thorough evaluation and discussion of factors that are crucial for developing efficient and effective occupancy studies (e.g., study site selection, allocation of sampling effort). However, the chapter is probably of limited use for developing monitoring designs. Chapter 7 (multiple-season models) provides useful study-design guidance that is relevant to monitoring (e.g., the limitations of a rotating panel design), but lacks detail on statistical power. A general treatment of study design that included the details in both chapters 6 and 7 would have been preferable.
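As a rough illustration of the simplest case covered in chapter 4, the sketch below fits a single-season occupancy model with constant occupancy probability (psi) and constant detection probability (p) by maximum likelihood. The detection histories are invented, and the code is our own assumption rather than material from the book or from PRESENCE.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Hypothetical detection histories: rows are sites, columns are repeat surveys
# (1 = detected, 0 = not detected). These data are invented for illustration.
histories = np.array([
    [1, 0, 1], [0, 0, 0], [1, 1, 1], [0, 1, 0],
    [0, 0, 0], [1, 0, 0], [0, 0, 0], [0, 0, 1],
])

def neg_log_likelihood(params):
    # psi = occupancy probability, p = detection probability (both constant).
    psi, p = expit(params)              # logit scale keeps values in (0, 1)
    detections = histories.sum(axis=1)
    n_surveys = histories.shape[1]
    # Sites with at least one detection are certainly occupied; sites never
    # detected are either occupied-but-missed or genuinely unoccupied.
    site_lik = np.where(
        detections > 0,
        psi * p**detections * (1 - p)**(n_surveys - detections),
        psi * (1 - p)**n_surveys + (1 - psi),
    )
    return -np.log(site_lik).sum()

fit = minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = expit(fit.x)
print(f"estimated occupancy psi = {psi_hat:.2f}, detection p = {p_hat:.2f}")
```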

Chapters 8 and 9 deal with two ways to investigate multiple-species occupancy patterns: (1) interactions among a small number of species (chapter 8) and (2) changes in species richness (chapter 9). As the authors acknowledge in the introduction to chapter 8, these two chapters are not as well developed as previous sections of the book, providing few, if any, examples for each method. The lack of associated software and example code in this section, with the exception of the two-species interaction model implemented in PRESENCE, version 2 (Hines 2006), will limit the use of these methods to statistically savvy readers with knowledge of computer programming. Despite the lack of implementation detail and the paucity of examples, the authors do an excellent job, as in previous sections, in presenting the material in a logical order and clearly deriving and explaining all models in a way accessible to all biologists. Analysis of occupancy data at the community level is also a very active area of research, and we expect user-friendly software implementing many of the methods described here to become available in the near future.

The book closes with short sections detailing possible models and approaches (most of which are untested) for multiple occupancy states and for integrating habitat, abundance, and marked animals into occupancy estimation.

Most of the examples in the book were analyzed using PRESENCE, which is freely available via the internet and includes copies of the example data sets. However, the book is neither intended nor appropriate for instruction in using PRESENCE. Several examples also include code for fitting Bayesian models with WINBUGS, version 1.4 (Spiegelhalter et al. 2003). Their use will require familiarity with the software and Bayesian analysis methodology.

Modeling Disability in Long-Term Care Insurance

Long-term care (LTC) costs, and in particular those arising under an LTC insurance contract, are difficult to estimate. This is because of the complex effects of the processes of aging: disability and cognitive impairment. Because disability is a gradual, as opposed to a discrete, process, and because its effects are sometimes reversible, a fairly complex model is necessary to capture its nature. This paper concentrates on modeling the disability process of aging only and, in particular, fully incorporates the recovery process as dictated by the data. With the recovery process modeled, the effect on the estimated model costs of disability of the common simplifying assumption that recoveries can be ignored is easily assessed.

This paper has twin objectives: (1) to present novel methodology, the penalized likelihood, for using interval-censored longitudinal data, such as the National Long-Term Care Study, to parameterize Markov models; and (2) to estimate the costs arising under an LTC insurance contract in respect of disability. The model is also used to show that ignoring recovery from disability can lead to significant overestimation of LTC insurance costs-suggesting that claims underwriting in LTC insurance may be an important factor in managing claims costs.

1. Modeling Long-Term Care Costs

Modeling the processes that lead to claims under a long-term care (LTC) insurance policy is complex. There are different underlying causes of claiming (e.g., physical versus mental deterioration), and the processes being modeled may involve progression through a number of states of health (e.g., states of varying disability), rather than being binary as in life insurance (alive-dead), which in turn makes it difficult to specify objective claims criteria. In addition, events leading to claiming are often reversible (e.g., people can recover from some types of disability), a factor that can be difficult to allow for unless a suitable modeling framework is used. Indeed, previous researchers have often assumed that recovery from disability is not possible, or their models incorporate it in a very approximate manner (Alegre et al. 2002; Dullaway and Elliot 1998; Haberman and Pitacco 1999; Nuttall et al. 1994; Rickayzen and Walsh 2002). The continuous-time model proposed in this paper has no such restriction, and recovery from disability is fully incorporated in the model. This allows the effect of ignoring recoveries to be quantified, as well as providing some insight into the importance of claims underwriting in LTC insurance.

In the United States, interval-censored longitudinal data on disability have been available since the mid-1980s, from the National Long-Term Care Study (NLTCS) in 1982, 1984, 1989, and 1994. Interval censored means that individuals in the study are interviewed at fixed time points, so their disability status is known at those points in time, while the number and timing of changes in the status of interest (e.g., disability status) between interviews are unknown. These extensive studies, undertaken at great cost, include data on more than 35,000 lives, making them unique in scope. A linked series of questionnaires established loss of activities of daily living (ADLs) and instrumental ADLs, and institutionalization. In a Markov framework, as used in this paper, interval-censored longitudinal data give rise to estimates of a transition probability matrix over an extended time period. There is then the problem of how to "convert" these probabilities, within the Markov framework, into realistic (positive and real) parameter estimates (transition intensities), which are more useful for actuarial applications. The method we propose is intuitive, produces reasonable estimates, and is flexible enough to work where other methods cannot be applied.
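The penalized-likelihood method the paper proposes is not reproduced here, but the "conversion" problem it addresses can be sketched with a hypothetical three-state example and the naive matrix-logarithm route, which also shows why a more careful method can be needed.

```python
import numpy as np
from scipy.linalg import expm, logm

# Hypothetical three-state model (healthy, disabled, dead) observed every
# two years. P is a made-up two-year transition probability matrix, standing
# in for one estimated from interval-censored data; it is NOT from the NLTCS.
P = np.array([
    [0.80, 0.15, 0.05],   # from healthy
    [0.20, 0.65, 0.15],   # from disabled (the 0.20 is a recovery probability)
    [0.00, 0.00, 1.00],   # dead is absorbing
])
t = 2.0  # years between interviews

# Naive conversion: if P(t) = expm(Q * t), then Q = logm(P) / t.
Q = logm(P).real / t
print("Estimated transition intensity matrix Q:\n", np.round(Q, 4))

# Sanity check: exponentiating Q over the interval should reproduce P.
print("expm(Q * t):\n", np.round(expm(Q * t), 4))

# This matrix-logarithm route can produce negative off-diagonal entries
# (invalid intensities) or fail for some estimated P, which is one reason a
# penalized-likelihood method is proposed in the paper instead.
```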

2. The goal of this paper is to describe and parameterize a continuous-time Markov model of the disability process and to look at the cost of disability under an LTC insurance contract. The data used to estimate the model parameters are from the 1982, 1984, 1989, and 1994 NLTCS in the United States (1997 NLTCS Public Use CD), but this paper focuses in particular on the work done with the 1982 and 1984 NLTCS. Though not discussed in this paper, NLTCSs have also been undertaken in 1999 and 2004.

We start by introducing LTC insurance contracts in section 2. Then in section 3, after describing the data and some previous research that has used the data, the model and statistical framework are introduced. In section 4, after discussing the reasons standard maximum likelihood estimates cannot be calculated from the data, we look at how previous researchers have dealt with this problem. A novel method is then proposed and implemented to obtain estimates of the model's parameters. In section 5, confidence intervals for the parameters are estimated and then used in the graduation process. We use the parameterized model in section 6 to calculate the expected present value (EPV) of model LTC benefits in respect of disability. In particular, we look at single premiums for a range of sample policies and investigate the effect on model premiums of ignoring recovery from disability. Conclusions and discussion are provided in section 7.

The same measures of disability were used in all years and were defined by the inability to perform one or more of eight instrumental activities of daily living (IADLs, including light housework, laundry, meal preparation, grocery shopping, getting around outside, getting to places outside within walking distance, money management, using the telephone) or one or more of six ADLs (eating, getting in and out of bed, getting around inside, dressing, bathing, getting to the bathroom or using the toilet) without using personal assistance or special equipment.

Interleaving Modeling and Writing Activities in Systems Analysis and Design

A Systems Analysis and Design course should develop both the technical and the interpersonal skills of each student. Each student must be able to develop and use the various lifecycle models and be able to communicate with end users through these models. By creating interleaved modeling and writing assignments within the Systems Analysis and Design course, both objectives can be met. This paper presents a series of integrated modeling and writing assignments, used in a Systems Analysis and Design course, that have been developed to enhance both the technical and interpersonal skills of an IS student.

The Systems Analysis and Design course within the Information Systems curriculum provides the student with the skills necessary to analyze and design information systems (Gorgone, Davis et al. 2003). One of the major objectives of this course is to have the student develop and use each of the models, either structured or OO, in the Systems Development Life Cycle (SDLC).

A second objective is to make each student aware of the interpersonal skills necessary for successful systems development (Guinan and Bostrom 1986; Gorgone, Davis et al. 2003). In particular, the Systems Analysis and Design course should emphasize "the factors for effective communication and integration with users" (Gorgone, Davis et al. 2003, pg. 29). In fact, the models developed in the SDLC are rendered useless unless "effective communication patterns are used by developers and users".

These two objectives, model development and interpersonal/communication skills, are met simultaneously through a series of assignments developed for a Systems Analysis and Design course. In the course, the student is required, individually and then as part of a group, to develop a series of SDLC models and to write a corresponding memo that explains the purpose and use of each model and the student's understanding of it. This article describes how these assignments are used to meet these two learning objectives simultaneously.

In a typical Systems Analysis and Design course, topics range from planning to design and development activities, including the implementation of a database or other information system. However, this course is taught over a seven-week period, so only the activities within the planning, analysis, and design stages are addressed. The focus of the course is on the first objective, the development and use of the models in the structured approach; however, the course is regarded as a writing-intensive course by the University, so a significant writing component must be incorporated into the course.

Each student is given a series of four models to develop throughout the course. The models are for economic feasibility (return on investment, breakeven analysis, and net present value), data modeling (an entity-relationship diagram), process modeling (a data flow diagram), and database design (a database schema). As part of each modeling assignment, the student creates a two-page memo that explains the purpose, use, and specifics of the corresponding model in their own words. These individual modeling and memo-writing assignments are done using a straightforward case adapted from a textbook (Satzinger, Jackson et al. 2004). Each assignment is then graded and returned to the student.

In order to assess the learning from the initial assignment, the same assignment, using a more complex case study (similar to ABC Church), is completed by student groups, typically two to three students per group. Each student group develops the model, and writes a corresponding memo, for economic feasibility, data modeling, process modeling, and database design. In this way, concept learning is assessed.

1. The opening paragraph of the memo should state, in one to two sentences at most, the recommendation on proceeding with the project. This recommendation needs to be clearly stated and include a statement of the number of years for which the project should remain feasible, which may be less than the seven years.

2. The second paragraph justifies the recommendation using the results of the feasibility calculations. This is an assessment of the student's knowledge of what ROI, NPV, and BEA indicate.

3. The third paragraph incorporates the intangible benefits of the project. In class, it is noted that intangible benefits are likely to be as important as the tangible benefits. The intangible benefits are to be used as further evidence toward the recommendation. This assesses the student's use of qualitative information in making a recommendation.

4. The final paragraph includes a list of action items (modeling activities) that need to be performed based on their recommendation. In this paragraph, the student's knowledge of the next steps within the SDLC is tested because the action items describe what models need to be developed next.

Note that the benefits and costs in the individual assignment are designed so that the breakeven point occurs within the seven-year period and the final NPV and ROI are positive. In the group assignment, however, the benefits and costs are modified so that the breakeven point is reached within the seven years but the NPV and ROI at the end of seven years are negative; this forces the student group to consider recommending a shorter project life.
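A minimal sketch of these three calculations, using made-up costs and benefits and an assumed 10% discount rate rather than figures from the actual case, is shown below.

```python
# Minimal sketch of the three feasibility measures (ROI, breakeven analysis,
# and NPV) using made-up costs and benefits and an assumed 10% discount rate
# over the seven-year project life used in the assignment.

discount_rate = 0.10
initial_cost = 120_000            # hypothetical year-0 development cost
annual_benefit = 30_000           # hypothetical recurring benefit, years 1-7
annual_operating_cost = 5_000     # hypothetical recurring cost, years 1-7
years = 7

npv = -initial_cost
cumulative = -initial_cost
breakeven_year = None
for year in range(1, years + 1):
    net = (annual_benefit - annual_operating_cost) / (1 + discount_rate) ** year
    npv += net
    cumulative += net
    if breakeven_year is None and cumulative >= 0:
        breakeven_year = year     # first year the discounted cash flow breaks even

discounted_benefits = sum(annual_benefit / (1 + discount_rate) ** y
                          for y in range(1, years + 1))
discounted_costs = initial_cost + sum(annual_operating_cost / (1 + discount_rate) ** y
                                      for y in range(1, years + 1))
roi = (discounted_benefits - discounted_costs) / discounted_costs

print(f"NPV after {years} years: ${npv:,.0f}")
print(f"ROI: {roi:.1%}")
print(f"Breakeven year: {breakeven_year}")
```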

2.2 Assignment 2: Entity-Relationship Diagram (ERD)

The second assignment requires the development of an Entity-Relationship Diagram (ERD). After developing the ERD, the student writes another memo, addressed to a "CEO-level" client, describing the development effort surrounding the ERD.

1. The opening paragraph describes the purpose and role of the ERD within the development process. The student assumes that the client knows very little about database design and data modeling; therefore, it is important to state clearly the purpose of and need for the ERD.

2. The second paragraph identifies the entities specified in the ERD and the rationale for collecting data on these entities.

3. The third paragraph describes the relationships that exist between the entities. The student must describe what the relationship is and its significance in the problem.

4. In the fourth paragraph, the cardinality of each relationship and the significance of that cardinality are discussed.

5. The final paragraph again consists of action items, i.e., modeling activities that need to be performed beyond the ERD.

2.3 Assignment 3: Data Flow Diagram (DFD)

The third modeling and writing assignment is to create a context diagram (Valacich, George et al. 2004), a high-level DFD. After developing the context diagram, the student writes another memo, addressed to a "CEO-level" client, describing the findings with respect to the system scope.

1. The opening paragraph describes the purpose and role of a DFD, specifically a context diagram, within the SDLC; the same assumption about client knowledge of the process is made.

2. The second paragraph specifies the external agents that interact with the proposed system.

3. The third paragraph describes the nature of the interactions between the external agents and the system. The student must describe what type of information is either being provided to the system or requested from the system by each of the external agents.

4. The final paragraph again contains a statement of action items. These activities must include some discussion of the functional decomposition of the system that still must take place, i.e., the development of lower-level data flow diagrams.

2.4 Assignment 4: Database Schema

The final assignment involves the development of the database schema. The database schema is an important data model developed in the design phase of the SDLC that acts as the blueprint for the database itself (Valacich, George et al. 2004). After developing the database schema, the student must explain the model to the "CEO-level" client through a two-page memo that contains the following:

1. The opening paragraph of the memo describes the purpose and role of the schema. The student must also describe the relationship between the schema and the ERD.

2. The second paragraph begins the discussion of how the ERD was converted into the corresponding database schema by describing how each of the entities of the ERD was transformed into a database table, including the identification of primary keys (a brief sketch of this conversion appears after this list).

3. The third paragraph describes how the relationships of the ERD are represented within the database schema; this is primarily the discussion of the use of foreign keys.

4. The fourth paragraph explains, on a table-by-table basis, the normalization process and the transformation of each table into third normal form.

5. The final paragraph is the same as in each of the previous assignments, a discussion of the activities that need to be completed beyond the design stage.
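As a rough, hypothetical illustration of the conversion described in paragraphs 2 and 3, the sketch below turns a toy set of entities and one-to-many relationships into tables with primary and foreign keys; the entity and attribute names are invented and the sketch is not part of the course materials.

```python
# Hypothetical sketch of the conversion described in the memo: each entity
# becomes a table whose first attribute is treated as the primary key, and
# each one-to-many relationship is carried by a foreign key on the "many"
# side. Entity and attribute names are invented; they are not from the case.

entities = {
    "Customer": ["customer_id", "name", "email"],
    "Order": ["order_id", "order_date", "total"],
}

# (parent, child) pairs representing one-to-many relationships
relationships = [("Customer", "Order")]

def build_schema(entities, relationships):
    schema = {}
    for name, attrs in entities.items():
        schema[name] = {"columns": list(attrs),
                        "primary_key": attrs[0],
                        "foreign_keys": []}
    for parent, child in relationships:
        fk = schema[parent]["primary_key"]
        if fk not in schema[child]["columns"]:
            schema[child]["columns"].append(fk)       # add the foreign key column
        schema[child]["foreign_keys"].append((fk, parent))
    return schema

for table, spec in build_schema(entities, relationships).items():
    print(table, spec)
```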

Modeling Saves Time, Money

Advanced 3-D modeling software tools for creating complex parts can save best-in-class manufacturers 99 days in time-to-market, as well as dramatically lowering product development costs, compared to average manufacturing performers, according to a recent study by market researcher Aberdeen Group Inc. (Boston).

Simulation tools employing 3-D models allow best-in-class manufacturers to pay $50,637 less in product-development costs than average performers, the study notes. Because virtual prototyping reveals design issues online, manufacturers using 3-D software typically require 1.4 fewer physical prototypes than average performers, according to Aberdeen Group, and 6.1 fewer change orders than laggards, resulting in dramatic product-development time and cost savings.

Aberdeen's study, Transition from 2-D Drafting to 3-D Modeling Benchmark Report, quantifies the benefits of migrating to 3-D design software. The study also discusses the current pressures and challenges in the marketplace that make migration difficult. "To remain competitive, manufacturers migrating from 2-D drafting must maintain user productivity in the face of increasing demand for more products and more complex products and shorter time-to-market windows," notes Chad Jackson, the report's author and service director of Aberdeen's Product Innovation and Engineering practice. Best practices of best-in-class companies help them meet product development targets for revenues, costs, launch dates, and quality 84% or more of the time.

Friday, September 07, 2007

CAE Software delivers in-depth finite element modeling

Featuring Windows user interface, Femap[R] v9.1 pre- and post-processing modeling application is integrated with Nastran solver technology and provides customization and automation tools. Associative interface with Solid Edge v18 facilitates geometry transfer in preparation for analysis. Along with direct output of JT files, software features quadrilateral element meshing option, integrated BASIC development environment, and multiple CAD integration capabilities.

CINCINNATI - UGS Corp., a leading global provider of product lifecycle management (PLM) software and services, today announced Version 9.1 of Femap[R], its robust pre- and post-processing finite element modeling application that serves as a key component of the UGS Velocity Series, which the company announced yesterday.

UGS made the announcement at its 2005 Solid Edge User Summit and Executive Symposium.

Femap is the finite element analysis (FEA) component of UGS' new mid-market portfolio, UGS Velocity Series, the industry's first comprehensive, preconfigured portfolio of digital product design, analysis and data management software.

"Femap is focused on bringing analysis closer to the design process," said Alastair Robertson, manager of Femap Marketing, UGS. "Femap makes FEA more accessible and easier to use by experts and occasional users alike, while maintaining the integrity of the analysis. With the combination of Femap's new native Windows user interface, a newly established Femap Express integrated in Solid Edge Version 18 for fast, yet accurate analysis, UGS' renewed focus on computer-aided engineering analysis (CAE), and our new global reseller channel program, Femap is poised for explosive growth over the next several years."

Femap Version 9.1 offers in-depth finite element modeling functionality that allows access to advanced analysis solutions, in a native Windows environment. The user interface is updated to reflect the latest generation Windows look and feel, further enhancing usability and productivity. Femap is also highly integrated with Nastran, the industry's leading solver technology, to form a broad and comprehensive CAE solution.

Femap Version 9.1 includes powerful customization and automation tools among its new enhancements:

a fully associative interface with Solid Edge V18 for effective geometry transfer in preparation for analysis, maintaining data integrity between design and analysis;

a new direct CATIA V5 translator enhances geometry transfer for CATIA users; this new additional module strengthens the CAD integration capabilities in Femap, which already include CATIA V4, Pro/Engineer, SolidWorks, any Parasolid or ACIS geometry as well as STEP and IGES.

a fully integrated BASIC development environment, providing direct access to the OLE/COM API of Femap, as well any other OLE/COM-compliant application;

an updated macro-driven program file environment, with its own access window, that provides recording, editing and playback functionality for the automation of repetitive tasks;

strengthened NX Nastran integration with a new linear surface-to-surface contact capability;

improved mesh quality around critical boundaries and stress raisers with a new quadrilateral element meshing option;

improved results visualization and collaboration with a new option to directly output JT files; and

support for Nastran spot weld elements to aid modeling of sheet metal component fasteners.

GNS brings computer modeling to drug development

* Key players: Colin Hill, CEO, president and chairman; James Watson, chief operating officer and chief financial officer; Iya Khalil, vice president of research and development and executive vice president; Jeffrey Fox, vice president of cardiovascular research; George Reigeluth, director of business development

* What does your company do? Gene Network Sciences (GNS) is a biosimulation company. Its technology is used to create computer models that help pharmaceutical companies improve the quality of the drugs they're developing.

* Why is this technology useful in the marketplace? Drug companies face staggering research-and-development costs associated with the production of new medications, says Colin Hill, Gene Network's CEO, president, and chairman. According to GNS, bringing a new drug to market costs an average of $800 million and takes about 12 years. And still, most drugs fail before they even make it to market, Hill adds. About 80 percent of drugs never make it through their clinical trials, according to GNS. Of the medications that actually enter consumer use, an average of just 60 percent provide therapeutic benefits to patients. Using computer models helps reduce the risk and uncertainty inherent in the drug-development process, Hill explains. "We're trying to make the approach to developing new drugs less haphazard," he says. "We're about making the whole process more predictive."

* How do these software models work? The company takes various genetic data, including information from the Human Genome Project, and uses it to help determine how drug candidates will interact with the body. The models can be used in a broad array of applications, but GNS currently focuses on development of cancer and heart medications, Hill says. The models can help pharmaceutical companies determine the effectiveness of their drugs, but also help them do safety screenings. In heart medications, for example, the software can be used to help determine the potential for cardiac complications.

* How does this process fit with traditional laboratory testing and clinical trials? "The companies still have to do lab testing for [Food and Drug Administration] approval, but why they work with us, is we can help make those efforts more successful," Hill says. GNS software can run millions of experiments on its computers in a fraction of the time and expense it would take to test similar predictions in a laboratory setting, he says.

* What is your background? Hill graduated from Virginia Tech University with a degree in physics and earned master's degrees in physics from McGill and Cornell universities. Hill says he has been involved in academic research focused around GNS's core technology for several years. Computational advances of recent years and knowledge gained through efforts like the Human Genome Project made the use of the technology possible in a commercial setting, Hill says. GNS was launched about five years ago.

* Is the move toward computer modeling a broad trend in the drug-development industry? Pharmaceutical companies are moving rapidly toward computer simulations, Hill says. "The costs and success rates aren't getting any better," he says. "The fact is that people are dying when better drugs can't be made or matched." The investment community, academia, and regulators are also driving the move toward simulation, he adds. "We think we can help make this process cheaper, faster, and more successful," Hill says.

* What kind of growth are you predicting? The company has annual revenue of more than $1 million, but Hill declines to disclose exact figures. The firm has raised more than $4 million from investors and has received more than $7 million in grants from the National Institutes of Health, the National Institute of Standards and Technology, and the Department of Energy. The company is predicting growth of at least 50 percent annually beginning in 2006. The company also plans to boost hiring in the next year. It plans to have more than 30 employees at its 13,000-square-foot headquarters by the end of 2006, up from the current 20.

* What are some of your recent projects? The company announced in July it won a Small Business Innovation Research Grant from the National Heart, Lung, and Blood Institute of the National Institutes of Health. The six-month, $137,800 grant was aimed at further cardiac-modeling efforts, according to the company. In March, GNS announced a drug-development contract with Johnson & Johnson Pharmaceutical Research & Development, a division of Janssen Pharmaceutica N.V. The agreement involved the use of GNS technology in development of a pre-clinical oncology compound. Financial terms of that deal were not disclosed.

Thursday, September 06, 2007

Strategic price setting: ensuring your financial viability through price modeling

No one yet has figured out a formula or come up with a chart to tell hospitals when and how much to change their pricing. Tools that take the guesswork out of this essential but complicated task would be worth their weight in--well, increased revenues.

Without such tools, hospitals have to figure out pricing strategies on their own.

Years of reimbursement challenges coupled with skyrocketing labor and technology expenses have spurred hospitals to explore other avenues to enhance their revenue growth. Many are returning to a strategy that enjoyed more popularity a decade ago when fixed-fee payments were less dominant in the payer mix: strategic price modeling.


Compared with pricing strategies of years past, today's enhancement efforts are more limited in scope because the fixed-fee payment population grows every year. Still, pricing can be a highly effective strategy for hospitals in today's market. Through strategic, appropriate pricing, healthcare institutions can boost their revenue, fund greater access to capital, and remain financially viable.

It's not Wal-Mart

For the most part, research shows few differences in pricing behavior between not-for-profit and for-profit hospitals. And both types of hospitals struggle with the absence of a "magic" formula or industry standard to guide their pricing decisions.

Not long ago, many volume-hungry hospitals put marketplace issues before cost when determining pricing. But after years of tight margins and escalating costs--particularly in labor and supplies such as drugs, implants, and surgical devices--administrators are less pained at losing volume. Instead, they want higher revenues per unit, which they can achieve through strategic pricing. Hence, cost has regained its throne as king.

Getting an accurate sense of labor, supply, and other expenses is critical for a successful pricing strategy. That means determining the cost per procedure by DRG code, APC code, or payer, says Edward B. Carlson, vice president and CFO at Munson Healthcare, a regional not-for-profit healthcare system in Traverse City, Mich. Time studies, value units, and other basic cost-accounting techniques can help finance departments to drill down data to that level.

"You need some sense of what the relationship of cost-to-charge is at the procedure level, as opposed to just an overall ratio," says Carlson.

Charging Ahead with the CDM

Hospitals hoping to improve their revenue with pricing strategies often begin by gathering information. They start by turning a critical eye toward their chargemaster, or charge description master (CDM).

Keeping a CDM current and compliant can be a challenge for many organizations faced with constantly changing CPT codes and ever-evolving rules and regulations. But an accurate CDM and effective claims process are two of the hospital's most important allies, particularly under the CMS outpatient prospective payment system.

To ensure accurate claims and appropriate payment, the hospital must maintain its CDM. Ongoing maintenance of the CDM involves many steps, including:

* Eliminating rarely used or inaccurate codes

* Adding missing charges

* Correcting mismatched CPT and revenue codes

* Reviewing charges for accurate structure in the APC payment environment

* Making sure the CDM is compliant with all CMS regulations

Administrators at Seton Healthcare Network in Austin, Tex, have conducted strategic pricing studies for several years, aided by various software programs that help them determine where they'll get the best returns on pricing adjustments, based on their payer mix and market constraints. Douglas D. Waite, the network's senior vice president and CFO, is surprised by how many hospitals still don't follow similar strategies.

"Hospitals whose payer mix is very high in fixed payments from Medicare or Medicaid may not think this process is worth the investment in lime or money," says Waite. "Or they may have a lot of contracts that limit the size of their annual price increases or limit where they put those increases. But even for those hospitals, I would say it is worth the investment."

At Seton Healthcare Network, part of Ascension Health, administrators determine where they will get the best return on pricing opportunities on a charge-by-charge basis, not just a department-by-department basis.

"When we run the chargemaster, we might have 30,000 line items or more that we will review charge by charge with our department directors," Waite says. "We want to ask them if we will be pricing ourselves out of the market or encountering compliance issues if we change charges.



Modeling Software provides various analysis capabilities

Offering 32- and 64-bit support for Windows and Linux platforms as well as UNIX workstations, ALGOR v18 delivers linear static stress analysis, Mechanical Event Simulation, and fluid flow analysis capabilities. It supports display of results on isosurfaces; colored, 3-D streamlines; and particle tracking with size control. Able to export results to VRML files, software facilitates creation of fluid models and offers nonlinear material models support.

ALGOR, a leading provider of design, analysis and simulation software, announced that its latest software release, V18, features expanded capabilities for linear static stress analysis (optimized for 64-bit operating systems), Mechanical Event Simulation (new Arruda-Boyce and Blatz-Ko hyperelastic material models and Mooney-Rivlin, Ogden, Arruda-Boyce, Blatz-Ko and Hyperfoam finite-strain viscoelastic material models) and fluid flow analysis (automatic modeling of the fluid medium and a new segregated steady solver for faster runtimes). New presentation enhancements include the display of results on isosurfaces; colored, 3-D streamlines; particle tracking with size control; and the ability to export results to VRML files.

"The 64-bit optimization is part of a significant expansion of the hardware platforms supported by all of our ALGOR solvers, which when complete in 2005, will include 32- and 64-bit support for Windows and Linux and the first platform in our support for UNIX workstations. This expanded hardware support will allow users to analyze larger, more complex models faster than ever," said ALGOR Product Manager, Bob Williams. "Additionally, the new nonlinear material models allow for the consideration of a wider range of materials - especially rubbers and foams, while the new fluid medium modeling option allows CAD users to more easily gain access to our expansive suite of CFD tools."

"With the release of ALGOR V18, fluid models are now a snap to create," said Independent Contractor James A. Britch, P.E. "Model the structure in your 3-D CAD package, then select the fluid modeling option in ALGOR FEMPRO and simply define whether the fluid is internal or external. ALGOR then automatically creates the fluid model as a new part and matches the mesh to the structural elements."

"I have been using ALGOR V18 for a month and have been extremely pleased," said Marc A. Meadows, P.E., of Meadows Analysis & Design, LLC. "The 64-bit optimization is excellent. I look forward to the increased performance on my new 64-bit computer. The mouse customization is another great tool as I spend a lot of time switching between applications and this feature eliminates the need to change the operation of buttons and the wheel every time. I am also looking forward to the new VRML export options because my customers have requested this capability."

ALGOR V18 also includes:

CAD Support (Direct)

Ability to read surfaces and materials directly from SolidWorks

FEMPRO

Support for 3Dconnexion's advanced motion controllers (including SpacePilot, SpaceBall, SpaceMouse and SpaceTraveler)

Customizable mouse actions

Ability to match CAD mouse actions

Ability to select neighboring objects

Ability to zoom while snapping

Ability to create fillets between beams

Improved meshing capabilities

Part mesh matching

Ability to import 3-D DXF files

Mechanical Event Simulation/Nonlinear Solver

Ability to perform a draft motion-based analysis to verify linkages

Curve fitting for new hyperelastic material models

Ability to specify velocity and angular velocity throughout analysis

Smoothing contact parameters across different contact surfaces

Superview Results Environment

Contour plots for shell thickness

Factor of safety contour displays for beam elements

Ability to specify default display settings for new models

Ability to show node numbers for minimum and maximum values

To see a free learning session on the new ALGOR V18, view the "ALGOR V18" Webcast on the ALGOR web site. For more detailed information, contact an ALGOR account manager or visit the "ALGOR V18" features page.