Monday, December 6, 2010

Artificial Intelligence

Artificial Intelligence is a branch of science which deals with helping machines find solutions to complex problems in a more human-like fashion. This generally involves borrowing characteristics from human intelligence and applying them as algorithms in a computer-friendly way. A more or less flexible or efficient approach can be taken depending on the requirements established, which influences how artificial the intelligent behaviour appears.

Thursday, November 18, 2010

History of SEO

1995) The early days of Yahoo.
Optimization was born out of the roots of AAA, A#1, and Acme-style yellow pages/white pages alphabetical optimizations.


1996) Blind luck and keyword seasoning to taste.
The early days were stabs in the dark using simple keyword seasoning: poke it here, and look for a reaction there. The first concepts of density and location started to be used.




SEO optimization

Search engine optimization, or SEO, pertains to the way you optimize a site in order to improve its ranking in search engine results pages. If you secure a place in the top 10 list of major search engines for keywords or key phrases which are related to your site, more people will visit your site, and hopefully they will convert into customers. SEO involves a lot of different techniques. With millions of sites on the World Wide Web, it is almost impossible for people to know your site exists if you do not optimize it. It is therefore important to use several SEO techniques to make your site visible to your target market and beat the competition.

Wednesday, November 10, 2010

CIAC security website

The CIAC website provides an extensive, comprehensive resource for diverse computer security issues. These resources are presented in various forms and topics and are available to the public as well as the DOE community.

Distributed attack tools: understanding them and defending against them. "Packet Storm is the largest internet security tools database in the world. We provide intelligence ranging from security tools to system defense and assessment information."


Wednesday, October 27, 2010

Cybernetics

Perhaps because the field is still young, there are many definitions of cybernetics. Norbert Wiener, a mathematician, engineer, and social philosopher, coined the word "cybernetics" from the Greek word meaning steersman.

He defined it as the science of communication and control in the animal and the machine. Ampère, before him, wanted cybernetics to be the science of government. For the philosopher Warren McCulloch, cybernetics was an experimental epistemology concerned with communication within an observer and between the observer and his environment.


posted by madhu

JAYDAX CIC is a privately owned web business, based in the UK.

As you can tell from our website title, this is an information site. We offer advice and assistance with most related items. We aim to be informative and helpful.
We are here because when we first started using the World Wide Web in the 1990s, we had trouble finding the information and services we needed. The information was there, but we did not know what to look for. Since then the internet has grown hugely, but the problem still remains.

Each day we discover new services, information, and ideas which have become available on the web; when we spot them, we will pass them on. Did you know, for instance, that you can look up web pages that are no longer on the internet by searching for them in the Internet Wayback Machine, or that you can use Google.



posted by madhu




Saturday, October 23, 2010

Linux Distribution

A Linux distribution running the KDE 4 desktop environment. Linux is a family of Unix-like computer operating systems. Linux is one of the most prominent examples of free software and open source development: typically all underlying source code can be freely modified, used, and redistributed by anyone. The name "Linux" comes from the Linux kernel, started in 1991 by Linus Torvalds; the system's utilities and libraries usually come from the GNU operating system, announced in 1983 by Richard Stallman. The GNU contribution is the basis for the alternative name GNU/Linux. Known for its use in servers as part of the LAMP application stack, Linux is supported by corporations such as Dell, Hewlett-Packard, IBM, Novell, Oracle Corporation, Red Hat, and Canonical Ltd.

posted by madhu

Windows 7

The latest client version in the Microsoft Windows line. Microsoft Windows is the collective brand name of several software operating systems by Microsoft. Microsoft first introduced an operating system environment named Windows in November 1985, as an add-on to MS-DOS, in response to the growing interest in graphical user interfaces (GUIs).

The most recent client versions are Windows 7 and Windows Server 2008 R2, which became available at retail on October 22, 2009.


Mac OS X is a line of graphical operating systems developed, marketed, and sold by Apple, the latest in a family that has been Apple's primary operating system since 1984. Unlike its predecessors, Mac OS X is a Unix-based graphical operating system.



posted by madhu

Friday, October 22, 2010

Intel-McAfee deal baffles security analysts

Several security analysts today expressed surprise that Intel would purchase security tool maker McAfee, noting that at first glance the move makes little sense for a pure hardware firm.
Intel on Thursday agreed to buy McAfee for $7.68 billion, which analysts are calling a premium price tag.
At best, he added, the acquisition could be a good venture capital investment for Intel, "and they may get a little cross R&D benefit from the deal as well."

Intel president and CEO Paul Otellini said this morning that the acquisition was driven by Intel's belief that security has become a fundamental component of online computing.



posted by madhu


Google targeting Apple iPad with Chrome tablet?

Google Android was always going to be at the heart of many Linux-based, iPad-like devices, and these will be sold in partnership with Verizon starting on November 26th. That date is already engraved in every retailer's heart as Black Friday, the day after Thanksgiving and usually the biggest shopping day of the year.

After those nuggets of news, everything else that's been written about the Google Chrome tablet has been pure speculation. Yet I can believe the core of this story. Google already worked with HTC to deliver one of the first Android smartphones: the currently developer-only Nexus One.

Sunday, July 4, 2010

Fighting Cybercrime on the Internet

This note is based on a presentation on cybercrime by Laura Taylor, TEC Director of Security Research for the E-Gov 2000 Conference sponsored by SAIC on July 10, 2000 at the Washington Convention Center.

Note: Portions of this note are excerpted from the presentation, other parts are explanatory text to relate this information to the Technology community serviced by the TEC web site. Information that was not taken directly from the presentation is in blue.

I am from a company called TEC, or TechnologyEvaluation.Com, a hybrid online destination site and research consulting company in Woburn, Massachusetts and Montreal, Canada. I have been working in the capacity of Director of Security Research at TEC for almost a year. Prior to TEC, I worked as Director of Information Security for CMGi's flagship webhosting company known as Navisite. Prior to that I founded a consulting company called Relevant Technologies, which still exists, and currently I maintain a position on the board. Before that, I was CIO of Schafer Corporation.

At TEC I manage the research of security technologies and vendors, identifying and qualifying key criteria necessary to assist high-level IT decision makers in making best-choice infrastructure investments. As well, I report and analyze current security news events, pointing out how these events affect you, your network, and your organization. As businesses continue putting their web-enabled e-commerce sites, and the jewels of their infrastructure online, the importance of security and privacy is becoming increasingly critical. What I plan on talking about today is "Fighting Cybercrime on the Internet."

My research is supported by 17 years of industry experience in the Information Technology field. There are three primary aspects of cybercrime that I will be talking about today: cyberpedophilia, keeping digital evidence pure, and mitigating white collar cybercrime. The other various security topics that I will touch on will have to do with how processes and procedures can support the management of these three important Information Age Law Enforcement and Public Safety concerns. The various security processes worth understanding include, "What are the basics for managing security in an organization? What security policies do you need? And who should you call to assist you in investigating and reporting cybercrime?"

Figure 1. Fighting Cybercrime

Why Should Businesses Be Concerned About Cyberpedophilia?

Criminals, including those involved in distributing pornographic material can use your website to promulgate their wares. Unless a business protects itself with firewalls, content filters, and risk management processes, it is vulnerable to penetration by these individuals for illegal purposes. If your website is used for illegal purposes, your company can be sued. Businesses are responsible not only for securing their websites against penetration, but also for insuring that the sites are not used for such illegal purposes as promoting pedophilia.

Before I start discussing how to manage cyberpedophilia, we need to first look at pedophilia in general, and understand how to identify it so that we can most expeditiously enlist the proper authorities, create processes for action, and work towards national and local solutions. As a general rule of thumb, behaviors that are illegal offline are illegal online, and obtaining a search warrant in part depends on one's ability to identify what constitutes illegal evidence. The U.S. Code, Title 18, Sections 2251, 2252A, and 2256 are the definitive laws that describe the sexual exploitation of children. Since part of the problem is the lack of understanding of these laws, I'm going to take the time to recite these important sections of our U.S. Code.

Section 2251 of Title 18 clearly states that anyone who meets the following requirements has participated in sexual exploitation of children: "Any person who employs, uses, persuades, induces, entices, or coerces any minor to engage in, or who has a minor assist any other person to engage in, or who transports any minor in interstate or foreign commerce, or in any Territory or Possession of the United States, with the intent that such minor engage in sexually explicit conduct for the purpose of producing any visual depiction of such conduct, shall be punished as provided under subsection (d)." And subsection (d) stipulates fined or imprisoned not less than 10 years. Section 2251 goes on to say that, "If such person knows or has reason to know that such visual depiction will be transported in interstate or foreign commerce, or mailed, if that visual depiction was produced using materials that have been mailed, shipped or transported in interstate or foreign commerce by any means, including by computer, or if such visual depiction has actually been transported in interstate or foreign commerce or mailed."

Parents, legal guardians, or anyone having custody of a minor, who "knowingly permits such minor to engage in, or assist any other person to engage in, sexually explicit conduct for the purpose of producing any visual depiction of such conduct shall be punished as provided under subsection (d)." Schools need to be educated and informed about the dangers online, because they too are accountable and responsible for mitigating these dangers.

How Does This Relate to Web-hosting Providers?

If we take a look at Section 2252A of Title 18, it becomes clear that a web-hosting provider who knowingly possesses child pornography on a company owned hosting server, even if it is by contractual arrangement with a customer, can be held liable. From having worked at several web-hosting companies, I can assure you that today, most web hosting companies do not realize their liabilities in this area. 2252A states that accountable persons relating to child pornography constitutes "any person who knowingly mails, or transports, or ships in interstate or foreign commerce by any means, including by computer, any child pornography;" or any person who "knowingly receives or distributes child pornography that has been mailed, shipped, or transported in interstate or foreign commerce by any means, including by computer."

Title 18, Section 2256 contains explicit definitions which apply to pedophilia, and cyberpedophilia. In that section, it clearly states that "visual depiction includes undeveloped film and videotape, and data stored on computer disk or by electronic means which is capable of conversion into a visual image." It should be noted that "sexually explicit conduct" includes both gay, and straight sexual acts. In fact, there are many responsible gay adults who are adamantly abhorrent of some of these man-boy love web sites and would welcome the opportunity to help assist in getting them removed from the web.

At this point, Ms. Taylor went on to discuss computers and children, noting "Keeping children off the web, and off computers is not an option. In fact, we need to enable online access as much as possible, in order to enable our kids' survival as law-abiding contributing members of society."

She further explained that "in the online world, pedophiles do not have to expose themselves as adults to have access to kids, and usually don't. Cyberpedophiles hang out in online chat rooms, and typically pose as children themselves; this is one of the reasons cyberpedophiles are so successful. They pretend to be kids, and do not get picked up on anyone's radar screen as a possible threat. So let's take a look at some of the kinds of online dangers that threaten our nation's greatest treasure, our children."

Possession of child pornography is a crime. In 1996, the Child Pornography Prevention Act (CPPA) was instituted specifically to combat the use of child pornography using computer technology. Often some of the servers that these illegal images are published on also contain chat rooms which can be used to entice a one-on-one online chat with a minor.

Many webhosting companies do not even realize that they are hosting child pornography servers. Busy webhosting companies sometimes barely have enough time to answer the telephone. They sell the online publishing process, but often have no knowledge of the content that is being published. Many pornographic domain names are purposely esoteric so as to avoid scrutiny of law enforcement and the general watchful eye of the public. How many people here have ever taken a look at Whitehouse.com? Whitehouse.com is often the first stop for viewers looking for the Whitehouse website before they realize that they need to use the .gov extension and type in Whitehouse.gov.

Figure 2. Cybersafe Portals Need to be Protected

Though webhosting companies are usually compliant with law enforcement in resolving child pornography issues that come up, they are not content examiners, and as far as they are concerned, auditing content for illegalities is not a cost effective way to spend their resources. In fact, one of the biggest problems in combating online child pornography is the wide differences that exist in international standards and laws. When you call up a website, or domain name, the viewer does not know where the site is being hosted, nor does the viewer care. When a site is hosted by a country that does not view child pornography as illegal or objectionable, whose laws apply - the country where the server is located or the viewer's home country? On which side of the world do you put in place the technology and content filters? Who are the authorities that you should contact to help resolve pedophile webhosting sites and illicit chat rooms?

Authorities

Ms. Taylor went on to discuss cyberpedophilia as it relates to home, school, and library computers with information and guidance for parents, educators, and librarians, stating that "Part of the plan needs to be teaching children how not to become cybercriminals when they grow up. Waiting until bored technology savvy teenagers start perpetrating denial of service attacks on websites critical to our nation's economy and safety is waiting too long to teach kids online netiquette."

Process for Action

So how do we accomplish all this? What is our IT Agenda? Well, there's lots of work to be done. Janet Reno's proposal for LawNet to bring states together to help fight cybercrime is an excellent concept. While state attorneys general are working on developing a framework for LawNet, it is important to involve technologists at an early stage to make sure the regulatory objectives are in alignment with the proper network technology. A large-scale technology network of any kind requires complex project management with built-in work-flow, escalation thresholds, and centralized management. If set up correctly, processes built into LawNet could expressly manage certification of cybersafe school portals. The FDA regulates what kind of food we give our children in school cafeterias. Shouldn't we have an organization that institutes and enables minimum requirements for online safety? Schools need to know which portals are safe to use. A cybersecurity vision that works for our schools should be scalable and centrally managed. Imagine the overhead and unnecessary costs if every single school in America needs to install its own firewall and content filters.

Ms. Taylor went on to discuss "Securing the schools of America from cyberthreats," further noting that "It's a complicated technological problem that needs to be mapped strategically to the education, security, and law enforcement objectives of a greater national technology vision."

This was followed by a detailed discussion of the issues involved for schools, parents, and law enforcement, stating that "This new child protection law applies to all children under the age of 13 and requires that website operators contact parents and get their verifiable consent to their children's participation in one-on-one communication systems, chat rooms, or online pen pal programs. Who is enforcing this new child protection law?"

Have any websites been cited for violations of this new online child protection law? How can we find out which companies and organizations have violations in this area? Online advertising companies are notorious for collecting all kinds of personal information about online users through the use of what are known as web-browser "cookies," as well as online question and answer forms. If not architected appropriately, an online search engine may see searches done by a 10 year old girl with the keywords "girls" and "toys" and instead return sites with adult sexual paraphernalia. Once kids get into the 6th, 7th, and 8th grades, they will assuredly on their own put profane and explicit language in search engines just to see what happens. We need to understand which sites are appropriate for which ages and grade levels.

Conclusion

Just because you run a business with a web-site doesn't mean you can ignore cyberpedophilia. Awareness will cause you to take the proper precautions and ensure that the vendors you employ also take the necessary steps so that cyberpedophilia doesn't find your site a welcome host. All adults have a responsibility to protect children.

This discussion is only a beginning, setting the need for businesses to be aware of the problem and their potential liability. In future articles on this web site, Ms. Taylor will discuss the following:

  • How businesses can protect themselves from cybercrime (especially cyberpedophilia)


  • How not to contaminate the evidence, when a cybercrime has been detected


  • How to effectively manage the security of your IT systems

For a transcript of the full presentation, e-mail your request (with your e-mail address) to

Demand-driven Versus Traditional Materials Requirement Planning

Nowadays, manufacturers are increasingly subject to massive pressures to drive down costs and increase efficiency. However, these pressures often invalidate the traditional materials requirements planning (MRP) batch-based manufacturing planning and product costing approaches. Moreover, companies struggling to serve their customers using purely traditional MRP methodology are often unable to meet the demands for agility and responsiveness that consumers at the end of the supply chain are requesting. To make things worse, decreasing product life cycles mean that manufacturing and distribution are increasing in complexity. For the manufacturer, this translates into a need to better manage customer demands and expectations, and to respond accordingly.

With ever shorter product life cycles in fashion due to shifts in buying trends and marketers attempting to best guess what fickle consumers will desire from season to season, the supply chain cannot afford items that sit on shelves (tying up capital and facing obsolescence). Whether these items are finished products, components, or subassemblies becomes irrelevant in this scenario. Therefore, there is a real need to reduce inventory throughout the supply chain nodes. For example, carmakers tend to see an increase in supplying customized products, with consumers able to specify everything from the color palette for bodywork and interiors to choosing the latest electronic gadgetry for the dashboard. One can even witness customization creeping into mass-produced goods too, where certain versions of items of wide ranging types are being made available through different outlets, be it food, fast moving consumer goods (FMCG), etc.

Needless to say, the need to reduce capital employed within the manufacturing enterprise and the trend of outsourcing manufacturing to lower-cost regions overseas, which typically increases lead times (which customers do not appreciate), only complicates the conundrum of low costs and increased efficiency for embattled manufacturers. This means that customer management has to move up several steps of the efficiency ladder, with the best companies staying very close to their customers.

In reality, the way to overcome these difficulties and better serve customers might often be to adopt demand-driven manufacturing principles. For example, manufacturers could make product as close to the point of order as possible, anticipating needs, and delivering within an acceptable time frame. Demand-driven manufacturing is not a new concept. Japanese manufacturers went down this route back in the 1980s. Now the rest of the world is slowly waking up and moving away from batch production towards demand-driven manufacturing, because wavering demands from customers create too many demands on the inflexibility of traditional manufacturing methods.

It is old news that optimizing within the four walls of the factory is no longer a workable solution, and while outsourcing may be seen as important in lowering the price of finished goods, it causes further problems by increasing lead times in a world where decreasing lead times are equally necessary to satisfy the customer.

Further, manufacturing anything—from mobile phones and computers, to cars and toys in the discrete manufacturing segments, to meat processing, producing paints, and brewing beer in the process manufacturing segments—can entail extremely complex business processes. Namely, parts or ingredients are needed to make components, which in turn, need to be configured or assembled before a final product can be delivered to a customer.

This is Part One of a two-part tutorial.

Part Two will discuss demand-driven planning.

Scheduling and Forecasting

Alternatively, the manufacturing process itself can be rather straightforward, but is subject to difficult scheduling requirements, due to long lead times and fluctuating market demand. The APICS Dictionary defines demand as a need for a particular product or component, which could come from any number of sources. This includes customer order or forecast, an interplant requirement, or a request from a branch warehouse for a service part or to manufacture another product. At the finished goods level, demand data are usually different from sales data because demand does not necessarily result in sales. For example, in the sales scenario, if there is no stock, there will be no sale. Generally, there are up to four components of demand: cyclical component, random component, seasonal component, and trend component.

On the other hand, demand management is the function of recognizing all the demands for goods and services to support the market place, which involves prioritizing demand when supply is lacking. Proper demand management facilitates the planning and use of resources for profitable business results. In other words, it integrates supply and demand information so as to optimize operations. Forecasting applications, which predict activity over a weekly, monthly or even yearly time horizon, remain central, but demand management is a broader activity that can include replenishment, sales and operations planning (S&OP), integration with marketing, order, and customer resource management (CRM) systems.

The traditional time-series forecasting approach uses averages of the past performance of a demand stream to anticipate future demand, but more sophisticated systems take into account factors beyond historical demand, employing statistical methods to remove biases. These more complex forms of forecasting determine and predict the effect that "causal" or "event-driven" factors and macroeconomic indicators might have on demand. To that end, recent advances in forecasting have focused on gauging the impact of pricing and promotions, product introduction and obsolescence, intermittent demand, and product proliferation, while forecasting accuracy is improved through collaborative processes that allow sales and distribution channels to work interactively with forecasters.
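As a minimal sketch of the traditional time-series approach described above, a moving-average forecast simply averages the last few periods of a demand stream to estimate the next period. The demand figures and window size below are invented for illustration, not drawn from any real data:

```python
def moving_average_forecast(history, window=3):
    """Forecast next-period demand as the mean of the last `window` periods."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly demand history for one item.
past_demand = [100, 120, 110, 130, 125]

# Forecast the next month as the average of the last three months.
forecast = moving_average_forecast(past_demand, window=3)
print(round(forecast, 2))
```

More sophisticated systems layer causal adjustments (promotions, seasonality, macroeconomic signals) on top of a baseline like this one, but the averaging core is the same.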

Either way, planning and controlling this enormous flow of processes and information requires sophisticated software. Adding to this complexity is the distribution of manufactured goods to market, since many variables come into play, such as lead-times; customer orders; internal orders and inventories of products; components; and raw materials. Many decisions must be made, such as when to re-order components or parts, how much inventory to keep, and so on.

It was not until the late 1970s, when computers began being used in the manufacturing process, that some of these complexities could be mitigated. Solutions such as re-order point (ROP) systems and MRP have been the most common tools to plan for when, and how much, of a certain component or part should be re-ordered. To refresh our memory, a ROP system is an inventory method that places an order for a lot whenever the quantity on hand is reduced to a predetermined level, known as the reorder point. On the other hand, the master production schedule (MPS) is the anticipated build schedule for those items assigned to the master scheduler. It is a set of planning numbers that drives MRP, and it represents what the company plans to produce, expressed in specific configurations, quantities, and dates.
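The ROP method just defined can be sketched in a few lines: the reorder point is the expected demand during the replenishment lead time plus a safety-stock buffer, and an order is triggered when the quantity on hand falls to that level. All figures here are hypothetical:

```python
def reorder_point(daily_demand, lead_time_days, safety_stock=0):
    """Classic ROP: expected demand over the replenishment lead time,
    plus a safety-stock buffer against variability."""
    return daily_demand * lead_time_days + safety_stock

def needs_reorder(on_hand, rop):
    """An order is placed when on-hand stock reaches the reorder point."""
    return on_hand <= rop

# Hypothetical item: 20 units/day demand, 5-day lead time, 30 units safety stock.
rop = reorder_point(daily_demand=20, lead_time_days=5, safety_stock=30)
print(rop)                       # 130
print(needs_reorder(125, rop))   # True: stock has fallen to the trigger level
```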

Finally, MRP is a set of techniques that uses bill of material (BOM) data, inventory data, and MPS to calculate requirements for materials, to make recommendations to release replenishment orders for materials. Further, because it is time-phased, it makes recommendations to reschedule open orders when due dates and need dates are not in phase. For more definitions, see Glossary of Enterprise Applications Terminology.
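To make the calculation described above concrete, here is a simplified, single-level sketch of MRP's gross-to-net logic: the MPS quantity is exploded through the BOM to obtain gross requirements, then on-hand inventory and scheduled receipts are netted off. The bicycle BOM and all quantities are invented for the example:

```python
# Components needed per finished bicycle (single-level BOM, illustrative only).
bom = {"wheel": 4, "frame": 1, "seat": 1}
on_hand = {"wheel": 150, "frame": 20, "seat": 0}
scheduled_receipts = {"wheel": 50, "frame": 0, "seat": 40}

def net_requirements(mps_qty, bom, on_hand, scheduled_receipts):
    """Gross-to-net: gross = MPS x BOM quantity;
    net = gross - on hand - scheduled receipts (never negative)."""
    plan = {}
    for part, qty_per in bom.items():
        gross = mps_qty * qty_per
        net = gross - on_hand.get(part, 0) - scheduled_receipts.get(part, 0)
        plan[part] = max(net, 0)
    return plan

# MPS calls for 100 bicycles: wheels 400-150-50=200, frames 100-20=80, seats 100-40=60.
print(net_requirements(100, bom, on_hand, scheduled_receipts))
```

A real MRP run repeats this level by level down a multi-level BOM and time-phases the resulting planned orders, but the netting arithmetic at each level is exactly this.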

Impact of Computer on Planning Process

The impact that the computer had on material planning and enterprise management in the 1970s was immense. Moving on from manual planning with its huge decks of inventory posting cards, this new computerized system promised to automatically plan build and purchase requirements based on the finished products to be shipped, the current inventory on hand, the allocated inventory for other orders, and the expected arrivals. The posting, originally done manually on input/output cards, was replaced by transactions made directly in the computer and documented on pick lists. The amount of inventory was supposedly visible to anyone with access to a computer, and did not require the user to go to the card deck.

Hence, MRP represented a huge step forward in the planning process. For the first time, based on a schedule of what was going to be produced, which was supported by a list of parts that were needed for that finished item, the computer could calculate the total need and compare it to what was already on hand or committed to arrive. This comparison could then suggest an activity to place an order; cancel orders that were already placed; or simply move the timing to expedite or delay existing orders. The real significance of MRP was that, for the first time, the planner was able to answer the questions "what?", "when?", and "how much?". In other words, rather than being reactive and waiting until the shortage occurred, the planner could be proactive and time phase orders, including releasing orders with multiple deliveries. Indeed, the enterprise systems currently in use by most large corporations worldwide are an evolution of the MRP systems, which had managed to regiment former chaotic manual systems, to a degree.

Nevertheless, some simplifying assumptions were needed to allow the computers of the day to make the required calculations. One was that the orders should be started at the latest possible date to provide for minimal inventory while still serving the customer's need on time. This method is referred to as backward scheduling. Therefore, all orders were scheduled backwards from the desired completion date to calculate the required start date.
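The backward-scheduling assumption can be illustrated in a couple of lines: the order start date is simply the due date minus the lead time, leaving no slack. The dates and lead time below are illustrative:

```python
from datetime import date, timedelta

def backward_schedule(due_date, lead_time_days):
    """Backward scheduling: start at the latest possible date that still
    meets the due date, i.e. due date minus lead time, with zero slack."""
    return due_date - timedelta(days=lead_time_days)

# An order due December 20 with a 14-day lead time must start December 6.
start = backward_schedule(date(2010, 12, 20), lead_time_days=14)
print(start)  # 2010-12-06
```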

Also, there was no inherent slack time in the schedule. The downside of this assumption was that if there were any hiccups in the execution of the plan, the order would most likely be sent to the customer late. Further, if only one part required for the finished product was going to be late, there was no automatic way to know the impact on the other needed parts.

The result of the MRP run, which can take several hours in some environments, is supposed to tell planners how to organize their work by releasing production orders. MRP will, by default, create orders with specific due dates for products and when they need to be manufactured. Companies prioritize resources based on these calculated due dates. The unfortunate result is that other orders, perhaps more important, are neglected, which often leads to overtime in the factory. Therefore, slack would often have to be built into the schedule through conservative, often unjustifiably pessimistic lead times. Despite this drawback, the benefits of the system far outweighed the costs and more companies began to embrace the tools and techniques of MRP. For more information, see Enterprise Applications—The Genesis and Future, Revisited.

MRP Limitations

Combined with information from actual customer orders, MRP is still the most widely used tool in manufacturing industries to track, monitor, and order the volumes of components needed to make a certain product. However, many manufacturing environments have discovered that MRP has trouble controlling stock levels, which results in poor delivery performance. Also, MRP is incapable of handling demand-driven, ever-changing manufacturing, working better when demand for a particular product is constant (fairly even) and predictable. If there is any variation, however, then MRP loses many of its advantages and the benefits of using alternative planning approaches increase.

In fact, the main flaw of MRP is that it is too deterministic. It is too rigid, allowing no room for the natural variations that occur in real life: people getting sick or going on strike, trucks and shipments being delayed, machines malfunctioning, quality issues requiring scrap or rework, and customers not ordering according to forecast. In other words, MRP is a static model of a stochastic reality. It plans from a snapshot of customer orders, available parts, and so on, and attempts to apply a high degree of precision to something that is inherently imprecise.

Again, MRP strives to plan replenishment just before a withdrawal from stock, which does not work in some manufacturing environments. In the language of logistics experts, MRP is a "push" system: it schedules production based on forecasts and customer orders, and thus creates plans to push materials through the production process based on forecasts that, by nature, are not accurate. That is to say, traditional MRP methods rely on the movement of materials through functionally oriented work centers or production lines, and are designed to maximize efficiency and lower unit cost by producing products in large lots. Production is planned, scheduled, and managed to meet a combination of actual and forecast demand; production orders stemming from the master production schedule (MPS) and MRP planned orders are thus "pushed" out to the factory floor and into stock. External suppliers also work to support planned production, while materials management often relies on maintaining sufficient inventory, using a make-to-stock (MTS) rather than make-to-order (MTO) or assemble-to-order (ATO) approach.

What’s All This Benchmark Stuff, Anyway?

In the world of high performance computing, everyone wants to know how well a system performs before deciding to buy it. Benchmarks provide a relatively objective way of determining how well a system will perform under given conditions. What customers need to know is: which benchmarks are relevant to their particular needs, and which ones don't matter? In this article, we will go through some of the more common benchmarks used, and discuss in which areas/markets they are most important.

One caveat: although most benchmark tests are a reasonable attempt to simulate real-world conditions under which systems may be expected to operate, there will always be some deviation between the tested configuration and a customer's actual computing environment. Thus, benchmarks should be thought of more as guideposts than actual "scripted scenarios". However, benchmarks can (and often do) serve as a tool for comparison between different systems. It is the comparative aspect of benchmarking which provides the most useful information.

Benchmarks examined here will focus on hardware systems such as servers, desktops, and notebooks. Although loose Operating System (OS) comparisons can be made, doing so is more complex and can give misleading results. Loose comparisons can also be made between some applications (e.g. comparing a system running Oracle 8i to one running Microsoft's SQL Server to one running IBM's DB2), but, as with OSes, such comparisons can be misleading. However, trends can sometimes be assessed through judicious use and analysis.

Brief History and Background

For a long time, the only benchmarks to which anyone paid attention were related to the CPU. In the 1980s, people (especially salesmen) were fond of quoting how many MIPS (Millions of Instructions Per Second) a computer could perform. This was more meaningful when the bulk of computers were CISC (Complex Instruction Set Computer or Computing), a group that includes IBM-compatible personal computers.

With the advent of RISC (Reduced Instruction Set Computer or Computing) machines, measuring the number of instructions executed became an apples/oranges comparison, with the conflict akin to religious warfare. In addition, the lines between CISC and RISC have become more blurred. This conflict led to the need to develop benchmarks more focused on system performance than component performance, as well as to provide more refined performance figures for the CPU and CPU subsystem.

Another attempt at meaningful benchmarks was FLOPS (FLoating-point OPerations per Second), which rated processor speed. Computer manufacturers often quoted their systems as "XX megaFLOPS" (MFLOPS) or "YY gigaFLOPS" (GFLOPS). However, as users came to realize that FLOPS is an incomplete measure of system performance, other benchmarks were developed by groups such as the Standard Performance Evaluation Corporation (referred to as SPEC), a consortium of industry vendors who joined together for that purpose.
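
As a deliberately crude sketch of how a FLOPS figure is obtained, and of why it is incomplete, one can time a stream of multiply-add operations. In interpreted Python the interpreter overhead dominates the result, which itself illustrates how far a raw FLOPS number can be from delivered application performance; the function name and iteration count here are arbitrary:

```python
import time

def mflops_estimate(n=2_000_000):
    """Naive MFLOPS estimate from timing a stream of multiply-adds.

    Illustrative only: it measures one tight loop, not memory, I/O,
    or any of the system-level factors real benchmarks try to capture.
    """
    a, b, acc = 1.000001, 0.999999, 0.0
    start = time.perf_counter()
    for _ in range(n):
        acc += a * b          # 2 floating-point operations per iteration
    elapsed = time.perf_counter() - start
    return (2 * n / elapsed) / 1e6

print(f"{mflops_estimate():.1f} MFLOPS")
```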

As non-mainframe servers and personal computers have proliferated, both in actual numbers and in the range of applications they are called upon to run, more tests became necessary. The benchmark tests developed were more focused, and thus more applicable to a particular type of situation. For example, a test designed to measure how fast/well a humongous database query can be executed through a server is not suitable for comparing the 3D graphics performance of mechanical CAD workstations.

What we now find is that, in addition to industry/consortium-originated benchmarks, tests are developed by non-vendor groups. A key example is ZDNet eTesting Labs, a.k.a. Ziff-Davis Media Benchmarks (formerly known as Ziff-Davis Benchmark Operation [ZDBOp]), and the suite of benchmarks it has built. "ZD" is Ziff-Davis, publisher of computer-related magazines such as PC Magazine, PC Week, and PC Computing. ZDBOp currently provides and oversees more than ten benchmark tests, primarily PC-based but also covering Macintoshes, servers, and Internet performance.

The other key "group" providing benchmark suites is the individual vendors. Companies such as Oracle, SAP AG, Microsoft, and Lotus/IBM provide benchmarks for their specific products. As with the wider-focus tests, hardware manufacturers sometimes use these tests for competitive selling. Some provide the tests to potential customers to help them decide how much computer they will need to order.

Benchmarks

So, what are the benchmarks in current use, and what do they measure? Listed in Table 1 below are some of the better-known benchmarks, along with the kind of performance factors they measure/evaluate. This list is not all-encompassing, but it does list many of the benchmarks most users will find valuable and useful.


Table 1.

Test Name | Segment | Synopsis | Metrics
TPC-C | System | Measures transaction processing performance and exercises all related subsystems | tpmC, $/tpmC
TPC-H | System | Measures ad-hoc query performance | QphH, $/QphH
TPC-R | System | Measures performance of a standard set of queries | QphR, $/QphR
TPC-W | System | Measures transactions (e.g. e-commerce) for a business-oriented web server | WIPS, $/WIPS
SPECweb99 | System | Updated version of SPECweb96; measures peak throughput for web serving | Conforming simultaneous connections
SPEC CPU2000 | CPU subsystem | Measures CPU performance (replaces SPECint/fp 95) | SPECmark
SPECsfs97 | System | NFS file server throughput and response time | Ops/sec; overall response time (ORT)
SYSmark98/SYSmark2000 | Desktop | Overall general application performance, incl. office productivity and content creation | SYSmark rating
SYSmark/32 | Desktop | 32-bit application performance | SYSmark rating
SYSmarkNT4 | Desktop | Measures performance across a mix of applications (CAD, word processing, spreadsheet, project management, presentation) | SYSmark rating
i-Bench | Internet | Performance and capability of Web clients | Various
WebBench | Internet | Web, proxy, and cache server software performance | Score: rps; throughput: bytes/sec
NetBench | Server | File server's handling of 32-bit clients' I/O requests | Throughput: Mb/sec; response: msec
Winstone | Desktop | Overall 32-bit application performance | Winstone units
Business Winstone | Desktop | Application suite performance | Winstone units
High-End Winstone | Desktop | Applications for demanding users, e.g. multimedia (NT only) | Winstone units
Content Creation (CC) Winstone | Desktop | Content creation (e.g. Photoshop, Director) performance | Winstone units
Winbench | Desktop | Graphics and disk subsystem performance | Many (see table)
3D Winbench | Desktop | 3D subsystem, incl. graphics and software | Frames/second
PC WorldBench 2000 | Desktop, Notebook | System and application performance | WorldBench score
BatteryMark | Notebook | Battery life when running Windows applications | Life: minutes
Web Polygraph | Server (appliances) | Measures performance and value of caching server appliances | Throughput: rps; MRT: sec; price/perf: rps/K$
WebStone | Client/Server | Measures throughput and latency of HTTP transfers | Throughput: Mb/s; peak: conns/sec
VolanoMark | Server | Measures Java Virtual Machine (JVM) performance | Unitless score
DirectoryMark | Server | Measures LDAP directory server performance | Ops/sec; response time

VENDOR-BASED
MMB | Server/Client | MAPI Messaging Benchmark; measures throughput of "Medium User" profile actions executed over an 8-hour day | MMB
SAP | Server/Client | Performance of a system while running SAP R/3 | Number of users; response time
Oracle | Server/Client | Performance of a system while running Oracle 8i | User count; response time
NotesBench | Server/Client | Performance while running Lotus Notes; used to size servers | Throughput; response time

RETIRED
SPECweb96 | System | Measures peak throughput for web serving (results still searchable at www.spec.org) | Ops per second
SPECint95 | CPU subsystem | Measures CPU integer performance with main memory | SPECmark
SPECfp95 | Workstation | Measures CPU floating-point performance
AIM | Windows NT server | General performance; now defunct
AIM | Unix server | General performance; now defunct
ServerBench (retired) | Server | Performance of application server hardware and OS in a client/server environment | TPS

Which Ones to Check?

Customers need to know which benchmarks to use for comparing various applications. Shown below is a correlation table of some standard tasks to various benchmarks. A check mark means that the particular benchmark is a good indicator of how well a given system will perform the required task(s).

Figure 1. CORRELATION TABLE - Application vs. Benchmark


Pitfalls

Benchmarks can be misapplied in various ways.

One way uses the benchmark to provide "competitive data/analysis" for a task or tasks that bear no relation to what is really important. An example might be the attempted use of a CPU performance benchmark as an indicator of system-level performance. As the saying goes, "Many a slip 'twixt cup and lip"; for us, this means that while CPU performance and system performance are often linked, the link may be tenuous and not necessarily usable for comparison.

Another popular misapplication is to test a high-performing but impractical configuration, then present the results as if customers would actually get the measured performance. An example might be to test a disk-laden system configured for RAID0 (high performance but no data loss protection) when the typical customer wants/needs RAID1 or RAID5 (data loss protection but lower performance).
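
The capacity/protection trade-off behind this pitfall can be sketched with the textbook RAID figures. This is a simplified model (it ignores hot spares, controller behavior, and RAID5's write penalty), but it shows why a RAID0 result overstates what a protected configuration delivers:

```python
def raid_profile(level, disks, disk_gb):
    """Usable capacity and fault tolerance per RAID level (textbook figures).

    RAID0 stripes with no redundancy; RAID1 mirrors (half the capacity);
    RAID5 gives up one disk's worth of capacity to parity.
    """
    if level == 0:
        return {"usable_gb": disks * disk_gb, "disk_failures_survived": 0}
    if level == 1:
        return {"usable_gb": disks * disk_gb // 2, "disk_failures_survived": 1}
    if level == 5:
        return {"usable_gb": (disks - 1) * disk_gb, "disk_failures_survived": 1}
    raise ValueError("unsupported RAID level")

print(raid_profile(0, 8, 100))  # {'usable_gb': 800, 'disk_failures_survived': 0}
print(raid_profile(5, 8, 100))  # {'usable_gb': 700, 'disk_failures_survived': 1}
```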

Finally, sometimes vendors will quote unaudited benchmark data. Although this is often innocuous, what can happen is that a vendor will implement special features or "tune" the System Under Test (SUT). This can make it perform better than is realistic, and certainly better than a competitor's untuned system. We have seen at least one vendor pull figures from its Website because the performance figures quoted were theoretical, not real-world, and definitely not audited. While we commend the vendor for "doing the right thing", we believe the figures had no business being there in the first place.

To help you combat BMS (Benchmark Misapplication Syndrome), here are some questions you can ask of hardware vendors:

  • Have these benchmark figures been audited by (name of auditing organization)?

  • Why is the particular benchmark you're quoting applicable to my circumstances/needs?

  • How realistic is the benchmark configuration you tested?

Summary

As we have seen, benchmarks fill an important role in selecting a computer, from servers to notebooks. As with many things, benchmarks can be used for good or evil. Judicious use of performance data, combined with an understanding of what work/tasks you want your system to perform, will help reduce or eliminate a lot of the hype surrounding performance data. Don't be afraid to put the vendor on the spot regarding the figures quoted. Asking a few hard questions now may save a lot of work later, including the effort of buying another system - because the one the vendor sold you doesn't quite perform as well as they told you.

IBM’s NetVista Joins the Appliance PC Fray

IBM [NYSE:IBM] announced the availability of its new NetVista all-in-one and legacy-free computers, two new devices designed to simplify the computing experience. Fewer cables and smaller sizes make them easier to carry and set up. New drives, keyboards, and "Access IBM" buttons make them easier to use. New networking and security features make e-business easier to conduct.

With the optional IBM Portable Drive Bay 2000, the NetVista computers allow users to easily transfer data between computers. This means you can reduce the number of hard drives or CD-RW drives with a single, swappable drive that works in both ThinkPad notebook computers and NetVista desktop computers.

Features such as the embedded Security Chip, available on select models of the legacy-free NetVista S40, provide 256-bit encryption for extremely secure network and Internet transactions.

Market Impact

This year has seen a trend toward slimmed-down PCs as product offerings. Hewlett Packard [NYSE:HWP] and Compaq [NYSE:CPQ] already have similar business appliance PCs; Dell [NASDAQ:DELL] and Gateway [NYSE:GTW] have pitched their appliances to the consumer market.

Here's where the entry level NetVista stacks up:


Model | Processor | Hard Drive | Price
IBM NetVista S40 | Celeron 566 | 10 GB | $699
HP eVectra D9898T | Pentium III 600 | 8.4 GB | $999
Compaq iPaq legacy-free | Celeron 500 | 4.3 GB | $499

IBM has a good mix of features and value. For $200 more than Compaq's offering, you get:

  • Larger hard drive

  • Faster processor

  • E-mail and application suite (Lotus Notes client & Lotus SmartSuite Millennium)

IBM also includes free deployment and migration tools such as the System Migration Assistant, which collects user settings and data from an old PC and transports them to a new NetVista. The security features are a plus in highly secure environments, such as banking, but they do require additional software to be useful.

One minor point - IBM needs to address navigation on its NetVista eCommerce site. A simpler grid summarizing the differences among all the NetVista models would be helpful. When a site visitor sees "All in one from $1,799.00", but the "compare models" link brings up a grid which shows no models priced below $2,099, it's a little confusing.

User Recommendations

Organizations still committed to Lotus SmartSuite, or to a hybrid ASP/Terminal Server-hosted application paradigm, are going to have a hard time beating the S40 on price/performance, particularly for general office workers who need only a browser, e-mail, word processing, and spreadsheets. If Microsoft Office is a "must-have", the feature gap between the NetVista S40 and Compaq's iPaq narrows, but not decisively, since the iPaq doesn't include any application software in its base model.

We reiterate our prior points about appliance PCs - they may be cheaper to build and buy, but it remains to be proven if they are cheaper to deploy and support. Nevertheless, the NetVista is in the top tier of appliance PCs debuting this year.

ERP: Origins, Developments, and Trends

History teaches everything, including the future.
—Lamartine

The names for new technology systems continue to change, but the promises they make remain the same: improve the bottom line. This is the first installment of Back to the Basics, an intermittent series that will unearth the core definitions of buzzwords and key application systems, and chart their evolution. Understanding their evolution is essential to knowing their current use, future developments, and upcoming trends—and more importantly, for making informed decisions.

Enterprise resource planning: noun. An accounting-oriented information system for identifying and planning the enterprise-wide resources needed to take, make, ship, and account for customer orders.

—from the APICS Dictionary, 10th edition

Enterprise resource planning (ERP) systems started as a means for inventory control and grew to replace islands of information by integrating traditional management functions, such as financials, payroll, and human resources, with other functions including manufacturing and distribution. Currently, the complexity of business is creating new user needs; the growth of computers is developing new potential; the quest for new markets by vendors has given users a new voice; and ERP is evolving once again. Names and acronyms like extended-ERP, ERP II, enterprise business applications (EBA), enterprise commerce management (ECM), and comprehensive enterprise applications (CEA) are being tossed about, but what's really going on?

Adapted from Enterprise Applications—The Genesis and Future
series by PJ Jakovljevic

Evolution

In the 1960s, the key goal of an ERP system was inventory control. Manufacturers assumed consumers would continue their buying patterns and aimed to keep enough inventory on hand to meet demand. The sophistication of resource planning grew with the affordability and feasibility of the computer. In the sixties, computers were large, hot, noisy machines that occupied entire rooms, but by the seventies, average manufacturing companies could finally afford them. The innovation computers made possible led management to review traditional product cycles and resource allocation. Material requirements planning (MRP) systems were developed to ensure the right amount of material was on hand when needed. First developed by IBM and J I Case, a US tractor maker, MRP promised to automatically plan build and purchase requirements based on the finished products to be delivered, the current and allocated inventory, and expected arrivals. The master production schedule (MPS) was built to monitor the finished goods. Naturally, data from the MPS fed into MRP, which contained time-phased net requirements for planning the procurement of subassembly components, raw materials, and ingredients.
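
The gross-to-net logic at the heart of MRP can be illustrated with a small sketch. The lot-for-lot ordering policy and the sample numbers are assumptions for illustration only:

```python
def net_requirements(gross, on_hand, scheduled_receipts):
    """Period-by-period gross-to-net calculation (illustrative MRP logic).

    gross: gross requirements per period (from the MPS / parent BOM)
    scheduled_receipts: expected arrivals per period
    Returns lot-for-lot planned-order quantities per period.
    """
    planned = []
    available = on_hand
    for g, r in zip(gross, scheduled_receipts):
        available += r                     # receipts arrive first
        shortfall = max(0, g - available)  # demand that stock cannot cover...
        planned.append(shortfall)          # ...becomes a planned order
        available = max(0, available - g)
    return planned

# 4 periods of demand for 30 units, 40 on hand, one receipt of 20 in period 2
print(net_requirements([30, 30, 30, 30], 40, [0, 20, 0, 0]))  # [0, 0, 30, 30]
```

In a real MRP run, each planned order would then be offset backward by the item's lead time, which is exactly where the backward-scheduling assumption described earlier enters.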

MRP gave planners more control, allowing them to be proactive and use time-phased orders rather than reacting only when delays occurred. However, because of the limitations of computers at the time, the software could handle only limited variables. There was no way to see how a late part, for example, would impact overall production. The general assumption was that delays in the system would mean the customer would receive the product late. Also, backward scheduling (calculating the start date backwards from the desired completion date) had to be employed to minimize inventory while still meeting the customer's delivery date.

Determining the quantity of parts needed to complete the order, however, was not enough. Companies needed to create capacity plans based on materials, equipment, and priorities to improve efficiency. Thus capacity requirements planning (CRP) emerged. Unfortunately, again due to the limited capabilities of computers, variables such as idle time, maintenance, and labor could not be fitted into the CRP equation. Thus each work center was assumed to have infinite capacity, a problem that still plagues manufacturers today. Scheduling and planning remained imprecise. As a result, the need to factor in other resources became apparent.
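
The infinite-capacity assumption is easy to see in a sketch: a CRP-style calculation simply reports planned load against available capacity and leaves any overload for the planner to resolve by hand. The work-center figures below are hypothetical:

```python
def capacity_check(load_hours, capacity_hours):
    """Compare planned load against finite capacity per period.

    Classic CRP reports the load but never reschedules around it,
    which is the "infinite capacity" assumption described above.
    Returns the overload (hours that cannot actually be worked) per period.
    """
    return [max(0.0, load - capacity_hours) for load in load_hours]

# A work center with 80 hours/week of real capacity
overload = capacity_check([60.0, 95.0, 120.0, 70.0], 80.0)
print(overload)  # [0.0, 15.0, 40.0, 0.0] -- weeks 2 and 3 are infeasible
```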

This need moved beyond the shop floor. Keeping financial tabs on the coming and going of inventory, the labor and overhead involved, and the revenue generated from the delivery was also necessary. Manufacturing resource planning (MRPII) attempted to integrate business planning, sales, support, and other functions together so they could work in concert.

By the nineties, each functional area also saw the benefits of computerized tracking and planning. With computers more common and affordable and programming more sophisticated, each department could use its own software program. Unfortunately, that was the problem: disparate systems and different databases were not linked, and the need for integration became obvious. Moreover, the time to market for consumer goods decreased sharply because of consumer demand. This, combined with new Japanese manufacturing philosophies, meant that western enterprises had to re-evaluate their manufacturing processes. Just in time (JIT), which aimed to eliminate waste and material lag time, meant that suppliers and manufacturers had to develop closer relationships. Also, the declining share of labor caused cost of goods sold (COGS) to shift toward purchased materials. Planners needed to know the cost of material allocations immediately after orders were placed, but buyers purchasing raw materials needed to know the sales plan months in advance. A common database had to be developed: enterprise resource planning was born.

Revolution?

Currently, the goal of integrated ERP is to replace islands of information with cross-communication to ensure enterprise-wide coherency. Though ERP promises quick access to information, it is still plagued with problems it inherited from MRPII: assumptions of infinite capacity, and inflexible scheduling dates. However, ERP can be purchased as a product. Vendors now offer broad functional coverage nearing best-of-breed capabilities; vertical industry extensions; and strong technical architectures. This, combined with product enhancements, global support, and technology partners, is narrowing the gap between desired and actual features.

Traditionally, the biggest purchasers of ERP solutions have been large Fortune 100 companies. However, the surge of IT investment seen in the nineties broke in 1998; spending continued to fall until 2000 and has not yet returned to its former levels. As a result, vendors are now looking to increase their market share by meeting the needs of small and medium businesses. However, entering a new market is not enough to build a strong repertoire. What will truly differentiate the leaders in this industry is the breadth, depth, and diversity offered at the plant level, and the ability to meet the requirements of distribution centers. Furthermore, planning functionality will have to extend from the shop floor to distribution centers. This includes flow-based manufacturing, work instruction, dynamic dispatching, and other elements. Web-based, service-oriented architecture will also have to be factored in. New systems will also have to be more customer-focused, incorporating e-commerce interaction and collaboration with business partners.

ERP "extension" software is also in demand. Users want comprehensive functionality from advanced planning and scheduling (APS), manufacturing execution systems (MES), to sales force automation (SFA). As a result, broader customer relationship, business intelligence (BI), business-to-business (B2B) and business-to-commerce (B2C) functionalities are being included. These features need to be integrated, and ideally, "one-stop-shop" offerings should synchronize and integrate releases.

To meet the integration needs of users, all major, traditional ERP players have begun moving into the areas of supply chain management (SCM) and customer relationship management (CRM). For example, in 2004 it was reported that SAP's SCM revenue outpaced industry leaders i2 Technologies, Ariba, and Manugistics. Incorporating SCM functionality may be a way to circumvent MRP II's capacity planning limitations. Furthermore, APS, a subset of supply chain planning (SCP), allows users to create a feasible schedule using identified, finite constraints. Finite-capacity planning supports simulations, letting users analyze the results before committing to an action. SCM also addresses the need for enhanced information flow among customers, suppliers, and business partners outside the enterprise. The concept of global logistics was created by combining APS with specialized warehouse and transportation management solutions. Thus the global supply chain linked suppliers and user companies and encompassed all processes, from initial raw materials to the consumption of finished goods. Yet, while SCM and its offshoots promise to remedy some of the deficits of ERP, they will not replace it. No matter how responsive a supply chain execution (SCE) system is, it still functions on the premise of waiting for a problem to occur and then acting on it. That is just as flawed as relying on unyielding plans and never obtaining feedback. Both are needed for an enterprise to be productive.

Product lifecycle management (PLM) too may seem to be a rival system to ERP, perhaps more so than SCM. PLM solutions are oriented around creative product innovation processes, whereas ERP is transaction oriented. Furthermore, PLM stand-alone packages accommodate collaboration better than ERP. However, the market is up for grabs, and PLM vendors need to focus on easy integration with ERP in order to stay competitive. Likewise, if ERP vendors continue to develop extended functionality, collaborative capabilities, accessibility, and integration by incorporating universal interfaces and Web services standards, then PLM's current market superiority will noticeably diminish.

In addition to SCM functionality, vendors are also encompassing front-end CRM functionality in their ERP solutions. CRM itself has gone from a vast field of point solutions to suites of customer care applications covering SFA, field service, telesales, call centers, marketing automation, and more. E-business is also being factored into the ERP equation, as is real-time performance integration. Currently, ERP systems rely on relational databases, which are good at retrieving small numbers of records but do not effectively retrieve or summarize large numbers of records on request. To compensate, vendors are increasingly embracing on-line analytical processing (OLAP), which provides CFOs, CEOs, and other decision makers with high-level, aggregated views of data.
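
The kind of high-level, aggregated view that OLAP tools provide can be illustrated with a minimal roll-up over detail records. The sales data and field names here are hypothetical, and a real OLAP engine would precompute and index such summaries rather than scan rows on demand:

```python
from collections import defaultdict

def rollup(rows, dims, measure):
    """Aggregate detail records into a summary view (OLAP-style roll-up).

    rows: iterable of dicts (detail records); dims: keys to group by;
    measure: numeric field to sum. A relational system is tuned to fetch
    individual rows; an OLAP layer serves summaries like this instead.
    """
    totals = defaultdict(float)
    for row in rows:
        key = tuple(row[d] for d in dims)
        totals[key] += row[measure]
    return dict(totals)

sales = [
    {"region": "East", "quarter": "Q1", "revenue": 120.0},
    {"region": "East", "quarter": "Q2", "revenue": 90.0},
    {"region": "West", "quarter": "Q1", "revenue": 75.0},
]
print(rollup(sales, ["region"], "revenue"))  # {('East',): 210.0, ('West',): 75.0}
```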

Consequently, to meet these needs, ERP vendors have partnered with other vendors or are finding other means to create solutions. As a result, two things are happening. First, the lines between SCM, CRM, e-commerce, and ERP are blurring, though each remains a crucial area with its own specific functions and needs. Second, many users feel oversold on ERP solutions. Thus, when evaluating ERP systems, decision makers need to be all the more careful to determine and rank their priorities to ensure they don't become saddled with functionality they will never use. Ultimately, to make the best of an ERP solution, all the functional modules should have access to, and should use, the same data in near real time, and all the processes should be fully integrated. Users also need the ability to move seamlessly among modules. Otherwise, information will pool in certain areas and will be either disconnected from or, at best, loosely connected to other areas.

Written by TEC staff writer with files

