Thursday, October 31, 2019

Twelve Angry Men Essay Example

Twelve Angry Men - Essay Example It is about the group dynamics of the jury and how they change throughout the movie. To start with, the group comes from a wide range of backgrounds and beliefs, but what is more important is how they view the purpose of their task. Most want to just "get it over with" regardless of the outcome. Because it does not affect their lives in any significant way, they do not apply much critical thought to the evidence. Instead, they assume that because the police and courts are prosecuting the young man, he must be guilty. Thankfully for the defendant, one man, Juror #8, uses critical thinking and takes the instructions from the judge seriously. Twelve Angry Men can be divided into five sections of group development. The first stage, known as "forming," begins the dynamic and usually involves working out purpose, structure, and leadership. In the movie this part of the group development is portrayed at the beginning of the jury deliberations. Juror #1, the jury foreman (Martin Balsam), is ready to start but seems unclear on how to proceed. He clearly demonstrates that he is not really a leader type. He politely asks two of the jurors to have a seat so they can get started, without seeming the least bit managerial. Then, when the men assemble around the jurors' table, the foreman hesitantly discusses the various ways to proceed. He says he is not sure which is best and readily accepts the suggestion of one of the other men, a much more authoritarian type, that they take a vote so they "can all get out of there" (Henry Fonda). The foreman readily concedes, and the vote is eleven to one in favor of guilty, with Juror #8 (played by Henry Fonda) being the holdout. One of the more extroverted jurors says, "Boy oh boy, there's always one," which seems to imply that Juror #8 is only voting not guilty to cause trouble, gain attention, or for some reason other than the fact that he truly believes the defendant is not guilty. The juror who implies this accusation acts passive-aggressively to bully Juror #8. He wants Juror #8 to feel like everyone is against him, so that he will change his vote and then they all can "get out of there." Yet, he does not come right out and say it directly. This leads directly to the next stage of group development, "storming." Storming involves intragroup conflict and disagreement over who should be in control of the group, even if that control is not blatantly exerted. Juror #10 (played by Ed Begley) challenges the authority of Juror #1, the jury foreman, and Juror #3 (Lee J. Cobb) tells Juror #2 (John Fiedler) "to keep silent." Both Jurors #3 and #10 intervene when Juror #9 (Joseph Sweeney) wants to give his opinion. Then, Juror #6 (Edward Binns) physically threatens Juror #3 because he does not think he is showing Juror #9, who is the oldest of the group, due respect. Another instance that reveals the personalities of the group occurs when Juror #11 (George Voskovec) says, "I beg pardon." To which Juror #10 says, "I beg pardon? What are you so polite about?" And Juror #11 answers, "For the same reason you are not: it's the way I was brought up" (Henry Fonda). This clearly demonstrates that there are vast differences in background and personality in the group. From the revelation of these differences and likenesses, as with any group, small cliques begin to form. "Norming" is this clique-forming stage and occurs when the group begins to develop close relationships among its members.
Most members of the group are encouraged to participate. In Twelve Angry Men, even the more silent members of the group (Jurors 2, 5, 6) were encouraged to contribute their opinions to the discussion. During norming, groups will generally demonstrate cohesiveness, yet in the movie, total unity never quite develops. In

Tuesday, October 29, 2019

Major Contributions Made By Ancient Egyptians And Babylonians To Science Essay

Major Contributions Made By Ancient Egyptians And Babylonians To Science - Essay Example The ancient Egyptians were masters of the arts of stone working and metal working and of the production of faience and glass. Their products were used throughout the ancient world. Their understanding of astronomy was very advanced, and this knowledge was passed on to the generations that followed. In mathematics, they developed basic concepts in arithmetic and geometry. The ancient Egyptians understood the idea of fractions and knew how to add them. Some of the mathematical texts taught the finer points of arithmetic, geometry, and even word problems, and are not unlike modern primers. These and other texts indicate that the ancient Egyptians understood and could add fractions and could even find the volume of a truncated pyramid. Without the advanced mathematics they originated, the ancient Egyptians would not have been able to build the pyramids and other large structures (Encarta, 2005). Egyptian scholars wrote some of the earliest known medical texts. These texts deal with topics such as internal medicine, surgery, pharmaceutical remedies, dentistry, and veterinary medicine. Medical papyri taught physicians how to deal with both internal medicine and surgery (Encarta, 2005). Ancient Egyptian doctors were the first physicians to study the human body scientifically. They studied the structure of the brain and knew that the pulse was in some way connected with the heart. They could set broken bones, care for wounds, and treat many illnesses. Some doctors specialized in particular fields of medicine, such as eye defects or stomach disorders (Lesko, 1989).
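The truncated-pyramid result mentioned above is commonly attributed to problem 14 of the Moscow Mathematical Papyrus. As a hedged illustration, the formula and worked numbers below are the standard modern reconstruction, not something stated in the essay or its sources:

% Volume of a truncated square pyramid (frustum): a and b are the side
% lengths of the bottom and top squares, h is the height.
V = \frac{h}{3}\left(a^{2} + ab + b^{2}\right)
% With the numbers usually quoted from the papyrus, a = 4, b = 2, h = 6:
V = \frac{6}{3}\left(4^{2} + 4\cdot 2 + 2^{2}\right) = 2(16 + 8 + 4) = 56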

Sunday, October 27, 2019

Effect of Standard Pricing Changes on Firm Operations

Effect of Standard Pricing Changes on Firm Operations The Rise and Fall of Standard Pricing and Its Effect on Everyday Operations For European and American Firms

Table of contents
Executive summary
General overview
Accounting overview
Literature review
Standard pricing as accounting practice
Operations management
Operations life cycle
Continuous improvement
Core value systems
Discussion and conclusion
References

EXECUTIVE SUMMARY
The overall purpose of this paper and study is to investigate cost or lean accounting within the operations management realm and how its unpredictable rise and fall allow organisations to continuously learn and utilize knowledge management as a core value. It was also important to use a larger organisation that has a history of outstanding operations and a customer-centered focus upon services. This investigation will require an in-depth study of work processes, communication and leadership with regard to knowledge management as a value within the team construct, while looking at how this reflects lean accounting principles. What tools are available, and what kind of evolution is Nestle undergoing in order to remain competitive in a changing economy? How does this change knowledge management and communication company-wide? What this study argues is that accounting practices are changing due to the evolving business plan. This is a movement toward modern accounting, and it is important to see the relationships between cost accounting, its fluctuations and how they impact the health of the organisation as a whole with regard to productivity and job satisfaction. How an organisation applies methods of costing within its framework for accounting of expenses, and their direct rise and fall over the time period of the product life cycle, directly influences the production, operation, distribution and employee retention of the global company. In fact, changes in accounting practices have caused many tried and true business models to cease to exist. Costing and its rise and fall can have a direct relationship with success and competitive advantage in the marketplace. However, the purpose of this study is to explore and reflect upon how accounting practices change operations management and the supply chain management model as a tool of managers and team members alike. Really, it is how accounting practices have changed business practices because of new legislation focusing on global companies in Europe and the United States. Accounting for costs, expenses and losses reflects the health of the organisation, and with change comes confusion. This study argues that with such changes comes a lack of definition not only of the company's value within the market but also of the value it has for its employees, as they become active participants and investors.

GENERAL OVERVIEW
How corporate accounting is handled is changing worldwide. How each expense is accounted for within an organisation's financial sheets has been evolving. Such a proposal for change has received much commentary not only from the financial community and corporate America but also from key members of Congress, European Union leaders and the public. Such a response results from the uncertainty that such change will benefit businesses and economic growth. It is feared that such change will have the opposite effect and cause world leaders to lose their competitive edge in the global market.
Still, this has not stopped the fuelling of the fire, as the American Financial Accounting Standards Board (also referred to as FASB) has struggled for an answer to such a dilemma. The urgency for a solution has only been stressed recently in light of debacles like Enron and Tyco. It is believed that companies do need to account honestly for expenses, but at what price to their employees, the public and the economy? Part of the issue with current legislation to change the practice of accounting for employee stock options is that there is no real way to value their worth. This creates an unsettling feeling among investors and employees struggling to understand this benefit.

ACCOUNTING OVERVIEW
What this truly means for any corporation functioning globally or even locally is that effective cost accounting becomes a volatile issue for management to consider. One could argue that the rise and fall of how costing/pricing plays a part in the entire operation has a negative effect upon how the company's valuation is seen on the open market. Costing at every step of the product life cycle plays a huge part in how this valuation is decided, from inventory at the shop floor level, to everyday operations management, to an employee's value with the company and their net worth personally. Changes within the global economy in recent years and the disappearance of tried and true business models leave many with a poor taste in their mouths, because one must understand how efficiency, affordability and effective leadership come into play. Effective pricing or costing of routine operations and corporate behaviours must be tracked and studied in order to carve the fat. This study aims to look at exactly what the rise and fall of pricing or costing means to a global organisation conducting business on many levels. For the purpose of proving the argument that such changes in accounting practice have a negative effect on the organisation, one will look at examples from the shop floor to the employee's estimated value with the company in the form of job satisfaction. Accounting for such expensing and pricing correctly is what makes the organisation strong but also accurate in valuation. With this in mind, traditional business models like Wal-Mart and Nestle are discussed because these are globally operating corporations. Debates about whether or not the fair value of employee and company stock options should be expensed on the income statement continue to rage among industry representatives, politicians, and pundits. Expense recognition of stock options can have significant impacts on net income and earnings per share, so this is a debate worth having. But many of those who analyze companies consider operating cash flow a better performance metric than income. One reason is that operating cash flow is thought to be free from the distortions that can grossly weaken reported income. In the case of employee worth and stock values, however, there is proof that this assumption is flawed. Option exercise affects operating cash flows in ways that analysts need to understand. Repurchasing shares to fund option exercise also results in financing cash outflows. The net cash flow impacts of options are often negative, but can be quite volatile from year to year.

LITERATURE REVIEW

STANDARD PRICING AS ACCOUNTING PRACTICE
It can be difficult to assess why a product has a certain cost or price to the consumer. How is it that companies arrive at a certain amount for a product or service?
What are the factors that play into this amount, and do they change over time while in the market? Mish clearly defines price as "the value or worth; the quantity of one thing that is exchanged or demanded in barter or sale for another" (2004, p. 985). A mistake that many companies make is that they allow the market to manage the price of the product and avoid strategic management of pricing in general. What is usually done, according to Nagle, is that "they list the prices based on their own needs and then adjust transaction prices based on what customers say they are willing to pay. Only a few companies question why someone is willing to pay no more than a particular amount or how that willingness could be changed" (2002, p. 1). In order to be strategic in pricing, a company must be confident and understand that "pricing involves managing customers' expectations to induce them to pay for the value they receive" (Nagle 2002, p. 1). Fortunately, when it comes to financial products, many customers remain in the dark about products and services. Sometimes a service-oriented company such as the Bank of England can take advantage of this uncertainty, but as more information becomes available due to the Internet, it is becoming increasingly difficult for a company to set the pace this way. More often than not, companies, especially financial ones that rely on customer relationships, allow for a value-based price structure that is contingent on the customer paying when value is delivered. This type of pricing system relies heavily on segmentation of the demographic when it comes to offering promotions and incentives to bolster customer loyalty. Much of this applies to financial-type products that are well defined for the consumer either through education or because these products are a must in life, like credit or loan products. Keeping this in mind, many financial products consist of high-quality products and add-ons that, when offered by one company, allow that company to diversify and establish the price. The table below aids in illustrating this point. Table 1: Pricing Strategies (Anderson and Bailey 1998, p. 2) It is also important for a company to keep in mind demand for the product or service. This is why diversification and globalization are quickly becoming elements of strategy as companies look for new ways to target consumers and enter new areas where their original product has a new life cycle. This is a matter of economics but important for understanding marketing strategy with regard to cost switching or price switching. "The greater the price elasticity, the closer the company can price products to similar competitive products and vice versa" (Allen 2002). In an industry like the mortgage industry, where homeownership is more prevalent in Western nations, elasticity is high and therefore it is fair to remain competitive with other companies. Also, a company like Nestle can bet that charging less may lead to more food products being purchased as customers find they get more for less money. In this respect elasticity can work either way. It really depends on the degree of risk one company is willing to take. Still, it remains to be found whether such a tactic even works when it comes to customer loyalty, as this will be explored in greater detail later. However, it remains to be seen if price loyalty does exist.
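To make the elasticity idea above concrete, here is a minimal sketch of the standard arc (midpoint) formula for price elasticity of demand; the function name and all figures are illustrative assumptions, not data from the essay or the sources it cites:

def price_elasticity(q0, q1, p0, p1):
    """Arc (midpoint) price elasticity of demand."""
    pct_change_quantity = (q1 - q0) / ((q0 + q1) / 2)
    pct_change_price = (p1 - p0) / ((p0 + p1) / 2)
    return pct_change_quantity / pct_change_price

# Hypothetical example: cutting a price from 10.00 to 9.00 lifts unit sales
# from 1,000 to 1,150 over the same period.
elasticity = price_elasticity(q0=1000, q1=1150, p0=10.00, p1=9.00)
print(round(elasticity, 2))  # about -1.33: demand in this made-up case is elastic

A magnitude above 1, as in this invented example, is what the passage means by high elasticity: small price moves shift demand enough that pricing close to competitors matters.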
It seems "the key to effectively competing for loyalty is ensuring the quality of the customer experience, not the quantity of customer rewards or discount prices" (Compton 2005, p. 1). However, the price needs to be adjusted for what the customer expects. It can be a cycle that changes continuously depending on the product or service. Carmona et al. (2004) write of the origin of the activity-based costing (ABC) method of accounting, which came into vogue in Europe during the 1920s. What ABC did specifically, as Carmona et al. (2004) write of Vollmers's work, was that it: Deployed significant efforts to account for distribution and marketing costs, which 'tend to be ignored today.' This first event is then taken as the record of the origin (both in terms of time and space), from which the new practice mainly spread both temporally and spatially. (p. 36) This was the start of a movement toward the double-entry system, and it saw delay and many weaknesses because it did not present a clear, complete picture of accounting. Its weaknesses were found in inefficiencies with charges and discharges. As a result, early double-entry systems were seen as unreliable and not useful to big business. It would not be until later that advanced bookkeeping procedures would take into account advanced operating processes in production. Carmona et al. (2004) found these systems, although not perfected, were used in England and the Colonies as early as 1760 (p. 37). It seems this was the trend, as no real streamlined, conforming system would be adopted until modern business practices came into place in the United States. Move to a global arena and model of production, and a more refined system is needed because a lot more is at stake. Global business is all about the details. It became common practice that the more investment was applied, the more generally accepted accounting practices became, as a diffusion of new technology. Accounting practices became more generally accepted behaviours as businesses became bigger and more prominent in communities across the world. Practices are implemented as Abu-Raddaha et al. (2000) surmise in the following: The information provided by accounting should facilitate international trade and capital flows, not hamper them. It should inform, not just report. More importantly, the information demands of both domestic and international financing and other commercial relationships have to be satisfied. (p. 19) Everything must remain in balance or be presented as a well-oiled machine. How does an organisation get to this point of transformation with its accounting practices? Modern accounting asks for more participation and optimisation from start to finish by the corporate accountant. The actions of corporate accountants must change as the movement toward lean functioning continues to take place. It should not be a painful process but one of creativity, flexibility and growth. There is a concern that lean accounting requires one to turn off creativity and be boxed into one function or thought process. This will be explored later as a post-modern viewpoint of business where each person has a function within the total quality management or TQM perspective. Modern business may use this as a framework, but the modern business model has evolved beyond this fixed view.
The truth of the matter is that modern accounting practices could not be further from this view of being boxed in; rather, they go beyond breaking the box and create a different mindset where thinking is seen differently than before. Accounting is seen differently, as not having finite possibilities but infinite reasoning. Traditional methods are flawed, as proposed by Van Der Merwe and Thomson (2007): "the direct costing approach doesn't absorb any overhead or even fixed costs… resource consumption accounting or RCA makes no arbitrary assignments at all" (p. 29). A lean, effective method allows for a more detailed account of capacity costs and a basic approach to data collection. Modern times call for modern values and thought processes with regard to business and seamless behaviour across the production floor. The lean method maintains a "one-touch flow system" (Van Der Merwe and Thomson 2007, p. 29) for information diffusion across the life cycle. This one-touch flow system can be integrated with a supply chain easily and reflects this value-added element as a method for better, honest accounting.

OPERATIONS MANAGEMENT
A most important factor for facilities management to recognize is the use of Total Quality Management (TQM) or a variation of TQM. TQM, according to David Steingard, is "a set of techniques and procedures used to reduce or eliminate variation from the production process or service delivery system in order to improve efficiency" (Steingard 2002, p. 2). TQM fits with the facilities management way of doing things, as many of their functions require repetition or constant monitoring of daily, weekly and monthly items. Because this is a modernist concept and the modernist movement believed in certainty and static methods of looking at the world, there is not much room for the uncertainty that change creates in today's workplace using strictly TQM. Therefore either change in this environment must be controlled change, or a variation of TQM must be used for the process to work and involve new technologies. Otherwise, TQM alone invents a work environment reminiscent of Fritz Lang's Metropolis and dehumanizes the employee. A variation of TQM can be used in facilities management to aid in defining team member responsibilities, as it sees the whole team as a machine that "creates a system of interlocking parts each with clearly defined use, centralized authority and high degrees of worker discipline culminating with the goal of routinised, efficient and predictable system performance" (Steingard 2002, p. 2). Each team member plays a role in the functioning of the machine. Still, much like today's business environment where change is constant, this system requires continued adjustment, modification and improvement of function. TQM as a way of defining a work process cannot operate entirely in today's global market because it succeeds at the expense of innovation and the growth of the employee. It also does not leave room to incorporate change and new ways of improving functions. Still, a memory of pure TQM feeds the "modernist machine of consumer capitalism which encourages over-consumption, planned obsolescence, ecological damage and depletion of natural resources" (Steingard 2002, p. 4). This memory has also burdened management as the obsession for perfection, control, consistency, productivity and efficiency increases over time.
In today's facilities team, there must be a healthy medium: not only using past methods for increased productivity and efficiency but also including modern tools and equipment to make the job easier. In order to remain competitive, technology cannot be ignored; the systems it provides must be implemented in order for logistics to remain seamless and keep up with demand and customer expectation. For instance, failure to embrace logistics and technology results in inventory costing a company more money to store than it is worth. McCullogh writes, "Right now sitting around the globe is a bunch of inventory (worth an estimated) United States $1 trillion - United States $1 trillion of boxes of stuff is just sitting around a warehouse" ('Warning: Don't Snub Logistics', p. 1). This has the potential to represent about 60 percent of the average company's working capital. This is capital in limbo that is not maximizing its investment potential. A sign of successful shop floor operations is reliance on very little warehousing. In other words, warehousing is measured as the number of days per month a product sits in the warehouse, and if logistics is implemented effectively, this number will decrease and stabilize. The retail average storage of inventory is 26 days of investment not being utilized, profit being lost and daily expenses being incurred in an endless holding pattern. Reducing the number of days inventory sits means companies must create tighter relationships with suppliers via the web or perfect a system of communication between resources to cut out warehousing altogether. Instead of inventory remaining in storage, because of wireless communication and data collection the product can go straight from the supply source to the retailer's shelves via a distribution centre that acts much like a mail sorting centre. This can work because technology enables a retailer to send data immediately to the supplier of products that are moving off the shelves with a click of a button. From this electronic message, the supplier knows what the retailer needs, what products are popular and how much, and sends them instantly to the retailer's distribution centre. In organisations the size of Nestle or Wal-Mart, logistics strategy requires much forethought and planning, as there are many branches and divisions that are involved in the process. The idea is to reduce expenses and increase value to the organisation by making the company more productive and efficient. This needs to be done as seamlessly as possible to continue brand loyalty and customer relations while maintaining market share and competitive advantage. In many ways, implementation of this strategy creates a delicate balance. In order to have better Business to Business or B2B relationships, one must understand the connection. Robert Thierauf and Hoctor (2003) explain, "B2B is about connecting shared businesses and information processes of the extended trading networks, planning, shipping and logistics, inventory management and customer retention to name a few" (Thierauf and Hoctor, p. 181). In other words, an optimized planning process can save millions of dollars and allow a multinational corporation to carry out its objective and gain market share. This means applying advanced technology such as i2, used by Dell Computers and typical ERP vendors. In today's act of doing business, B2B exchanges are based on supply chain management or SCM technologies (Thierauf and Hoctor, 2003, p. 182).
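As an aside, the 26-days figure quoted earlier can be made concrete with the usual days-inventory-outstanding calculation. This is a minimal sketch; the dollar amounts are invented for illustration and are not taken from the essay or its sources:

def days_inventory_outstanding(avg_inventory_value, annual_cost_of_goods_sold, days_in_year=365):
    """Average number of days stock sits in the warehouse before it is sold."""
    return avg_inventory_value / annual_cost_of_goods_sold * days_in_year

# Hypothetical retailer: 2.6m of average stock against 36.5m annual cost of goods sold.
dio = days_inventory_outstanding(2_600_000, 36_500_000)
print(round(dio))  # 26 days, matching the retail average cited above

Driving this number down is exactly what the B2B and supply chain management technologies described above are meant to accomplish.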
Such technology will mean considerable investment, but the benefit of market share will prove it to be a valued investment over the long run (Burn and Hachney 2002; Scerbo 1999). Running these centres effectively certainly poses a management challenge. Manufacturers must develop new skills and confront channel conflicts with dealers, distributors and independent operators. Leaders in these positions must have an understanding of managing the conflicts in these channels. But well-managed distribution centres would more than justify the risks, as they would save the organisation a significant amount of overhead. With operating expenses as the main cost, it is possible to make the distribution venture essentially self-funding. Facilities can be rented on short-term leases and surrendered if the location isn't successful within a year or two. The cost of goods and labour can be managed as volume grows. Companies should remember that a manufacturer's original warranty work usually accounts for about half of the labour expenses and for as much as 20 percent of the total value of services rendered, but these costs are typically charged back to the business unit rather than borne by the company's distribution. In markets poorly served by local dealers or other distributors, for instance, a centre should gear itself to its company's end users or consumers by choosing a high-traffic retail site. Profits at these locations are generated largely through the sale of accessories and optional services to walk-in or mail order customers; outlets thus need appealing product displays or sales presentations. Different kinds of retail distribution centres pursue different economic models. Although gross margins on sales to end users are higher, orders tend to be smaller. Locations that focus on distributors can achieve scale faster and be just as profitable. The largely similar economics of service centres vary only according to which customer segment is best served at each location. Companies run their own centres and tie management bonuses to profit and growth goals at each site. Either way, some support functions, such as marketing, human resources and information and financial systems, are best managed at the corporate level. Warehouses and distribution centres are caught in a squeeze between customer service demands and cost drivers. The challenge for most organisations is to create a network that can deliver on customer demands while keeping costs down. This is the number one challenge in supply chain management. Supply chain management presents a huge undertaking when it comes to overhead operating costs. Many of the tools have come down in price because usability has improved. As a result, more and more companies are adopting a supply chain management philosophy for distribution and are re-evaluating its effectiveness every two years, as opposed to every five years previously. Management members are interested to see whether the efficiency of the centre matches the service level provided. Research has found a direct relationship between the number of distribution points, transportation costs and customer service targets. The network and its design are driven by improvements so that the cost of transportation can be offset. This may include reviewing an organisation's transportation arrangements. Loading patterns should also be examined to find ways to cube out containers and trailers (Trunick, p. 1). What possibilities can be used to have a cost-effective outcome for the distribution centre?
Does this mean consolidating shipments or a move to parcel and less-than-truckload shipments? Can shipments be combined to make greater use of truck cargo space? Can the organisation hire rail or air as better shipping alternatives to using company trucks over longer distances? In addition to examining loading, can the routes used by the trucks be adjusted to add to cost-efficiency? An organisation would benefit from using its state's transportation management system or a department of transportation (DOT) to map out distribution volumes and patterns. This would help in providing dynamic routing options that can be flexible to changing distribution needs in the network. This can benefit the fleet by reducing fuel supply needs and help control costs and usage. These efficiencies would result because the routes would decrease in mileage, reducing wear and tear on the vehicles and insurance costs. Efficiency inside the four walls of the distribution centre can also be improved. Relatively speaking, the size of the average distribution centre has grown from 300,000 square feet to one million square feet (Trunick, p. 2). This is simply due to the operating space needed to move inventory from point A to point B. But the real reason the distribution centre is larger today is mainly because organisations have seen the need to put all operations under one roof. Putting multiple facilities into one larger distribution centre improves the time it takes to transport inventory. Still, the larger centre is made possible because of improved transportation systems but also the implementation of new technologies that enhance not only a brick and mortar store but also a virtual one. Plus, the organisation has the manpower under one roof. The company only rents one building and keeps the inventory in one place rather than moving it from warehouse to warehouse. This allows the company to provide better service to the consumer. Because of these factors, information systems are critical to the success of the larger distribution centre. Data needs to travel from one area to another, and that is why more and more companies are investing in radio frequency terminals, both handheld and vehicle mounted. Investment in these RFID systems is not inexpensive, and many retailers such as Wal-Mart and Target are looking for ways to enrich the present technology and systems without implementing a whole new infrastructure into the walls of the centre. Being able to enhance present systems proves to be cost effective because not only is an upgrade cheaper, it is also easier to train employees on. It is a company's ability to effectively handle investment in new technologies that allows the centre to run better. Still, as Trunick writes, the concern is not found in hardware but in data: "Databases have traditionally been structured to feed a number of different systems, but that's not a long term architectural solution" (p. 2). Part of the problem a distribution centre faces with data storage is being able to provide the data in real time and allowing the data to remain clean and uncrowded. As a result many companies are searching for better solutions than using RFID in supply chain management. It has not proven to be productive in the distribution centre setting, not like the 8 percent in the warehouse setting (Trunick, p. 2). One new technology that was introduced to the Nestle facilities management team in 2006 was the use of a computerized tracking system for client user orders.
This system was implemented to better track the status of job orders among the team members. The system acted to alert a team member of approaching deadlines and current job load status. It also allowed management to better track individual and team progress. This resulted in a monthly recognition program to signify when quotas had been met or when a team member received a client user compliment. The system also had the capability to record the negative, such as being late to a service call or failing to complete monitoring of weekly items for inspection. The system would then e-mail the team member and the direct supervisor if such conduct occurred (Facilities Training Group 2007, p. 11). This system replaced the old process of "tracking" client user orders, which consisted of logging each order into a spiral notebook. With the advent of the company's intranet site, management hoped to improve communication between the facilities team and the client user by offering an electronic request system. This would reduce the amount of time the facilities team spent fielding phoned-in requests and allow for multi-tasking of various jobs. What management had hoped the system implementation would result in did not happen, mainly due to team members' lack of communication and resistance to change rooted in pre-existing TQM elements within the old process of handling client user orders. Management had hoped, as the Business Open Learning Archive details, that "automation would exploit available technology to speed up operations, make them more reliable and to reduce unit costs and their risks and costs. This would bring flexibility to the system already in practice" (Operations Technology 2005, p. 1). This type of new technology, or just-in-time (JIT) technology, requires careful handling and extensive training. What facilities management team leaders had not prepared for was the team member response. Many of them, despite being competent, responsible employees, did not have knowledge of computer systems. Many of the team members had been with the company over twenty years and had been hired into the division. Many of these types fall into the category of being older but also having a specific specialization in which they were in the field most of the time (Facilities Training Group 2007, p. 24), not requiring any other extensive skills. Another factor management had not anticipated was a considerable language barrier, as many team members who had worked together for years continued working in their native tongue of Spanish. A final aspect of the mixed response from the team had more to do with timing than anything. Management provided a three-day training session and then allowed two weeks for the new system to be adopted. The transitional period was too short and was met with much resistance from many members of the team. Many did not accept the change or completely understand the new system. Many did not check their email or use the tracking component. Finally, despite company-wide advertisement of the new online request feature, most client users did not use it and continued to phone in requests. This resulted not in a decrease in time spent on the phone; instead, due to the new system's lack of transition and its rejection by some of the team, the group received three times as many calls in one week (Facilities Training Group 2007, p. 33). The team had to hire a temporary employee to aid in taking calls while team leaders provided on-the-job training and supervised walk-throughs of the new process.
The period of six weeks it took the team to get ba

Friday, October 25, 2019

It's Time for Americans to Understand that Freedom Isn't Free Essay

It's Time for Americans to Understand that Freedom Isn't Free I feel inspired and patriotic every time I see a car's back bumper sticker featuring an American flag stating, "Freedom Isn't Free!" The moral clarity of those words rings as true as the Liberty Bell. Those Americans who do not fathom the significance of the motto Freedom Isn't Free suffer from the very problematic "victim/slave mentality," which ultimately will become a future reality should more citizens not heed the simple message the sage language conveys. Yes, it indeed bears repeating: "Freedom Is Not Free!" Its acquisition from King George's England involved struggle, its maintenance throughout the first two and a quarter centuries of our Great Republic required sacrifice, and its continuation demands perseverance. Wise people fully realize that struggle, sacrifice and perseverance are the vital characteristics of freedom, democracy and independence. In the late 1930s complaisant European nations were lulled into the jaws of the very dangerous "victim/slave mentality." Weak democracies tried placating and accommodating the tyrannical proponents of the Communist, Socialist and Fascist ideologies, and Europe soon found itself in jeopardy with maniacs like Stalin, Hitler and Mussolini threatening the existence of taken-for-granted freedom and human rights. Thanks to the intervention of the United States, Hitler and Mussolini were defeated (despite incredible adversity) and Europe was salvaged from the scourge of Fascism. But Nazi Fascism did not go away meekly. Its defeat required intensive struggle, sacrifice and perseverance, with over 50 million military and civilian deaths occurring during the widespread devastation. Yes, during World War II the social a... ...civilians with ordinary families. Their wicked goal is to see Manhattan, Washington DC, Chicago, Philadelphia or Los Angeles totally obliterated. We cannot allow this to happen even though our solution might endanger the rights of "suspected war criminals" who think your neighborhood is their battlefield. If the anti-war "victim/slave mentality" should ever become the majority opinion in America, then the lyrics of the rock group Kansas would become prophetic truth: "All we are is dust in the wind!" Let's be wary and vigilant and not permit horrific catastrophe to happen! Wake up, all you American Apologists, while you still have precious breath in your lungs to do your pathetic apologizing! It's now time for all Americans to openly acknowledge that Freedom Isn't Free and that these dire times require the tried and true virtues of struggle, sacrifice and perseverance.

Thursday, October 24, 2019

History of Computer Viruses

THE HISTORY OF COMPUTER VIRUSES

A Bit of Archeology
There are lots and lots of opinions on the date of birth of the first computer virus. I know for sure just that there were no viruses on the Babbage machine, but the Univac 1108 and IBM 360/370 already had them ("Pervading Animal" and "Christmas Tree"). Therefore the first virus was born in the very beginning of the 1970s or even at the end of the 1960s, although nobody was calling it a virus then. And with that, consider the topic of the extinct fossil species closed.

Journey's Start
Let's talk of the latest history: "Brain", "Vienna", "Cascade", etc. Those who started using IBM PCs as far back as the mid-80s might still remember the total epidemic of these viruses in 1987-1989. Letters were dropping from displays, and crowds of users were rushing towards monitor service people (unlike these days, when hard disk drives die from old age but some unknown modern virus gets the blame). Their computers started playing a hymn called "Yankee Doodle", but by then people were already clever, and nobody tried to fix their speakers - very soon it became clear that this problem wasn't with the hardware, it was a virus, and not even a single one, more like a dozen. And so viruses started infecting files. The "Brain" virus and the bouncing ball of the "Ping-pong" virus marked the victory of viruses over the boot sector. IBM PC users of course didn't like all that at all. And so there appeared antidotes. Which was the first? I don't know, there were many of them. Only a few of them are still alive, and all of these anti-viruses grew from single projects up to major software companies playing big roles on the software market. There is also a notable difference in the conquering of different countries by viruses. The first vastly spread virus in the West was a bootable one called "Brain"; the "Vienna" and "Cascade" file viruses appeared later. Unlike that, in East Europe and Russia file viruses came first, followed by bootable ones a year later. Time went on, viruses multiplied. They all were alike in a sense: they tried to get to RAM, stuck to files and sectors, periodically killing files, diskettes and hard disks. One of the first "revelations" was the "Frodo.4096" virus, which as far as I know was the first invisible (stealth) virus. This virus intercepted INT 21h, and during DOS calls to the infected files it changed the information so that the file appeared to the user uninfected. But this was just an overhead over MS-DOS. In less than a year electronic bugs attacked the DOS kernel (the "Beast.512" stealth virus). The idea of invisibility continued to bear its fruits: in the summer of 1991 there was a plague of "Dir_II". "Yeah!", said everyone who dug into it. But it was pretty easy to fight the stealth ones: once you clean RAM, you may stop worrying and just search for the beast and cure it to your heart's content. Other, self-encrypting viruses, sometimes appearing in software collections, were more troublesome. This is because to identify and delete them it was necessary to write special subroutines and debug them. But then nobody paid attention to it, until... Until the new generation of viruses came, those called polymorphic viruses. These viruses use another approach to invisibility: they encrypt themselves (in most cases), and to decrypt themselves later they use commands which may or may not be repeated in different infected files.
Polymorphism - Viral Mutation
The first polymorphic virus, called "Chameleon", became known in the early '90s, but the problem with polymorphic viruses became really serious only a year after that, in April 1991, with the worldwide epidemic of the polymorphic virus "Tequila" (as far as I know Russia was untouched by that epidemic; the first epidemic in Russia caused by a polymorphic virus happened as late as 1994, three years later, and the virus was called "Phantom1"). The idea of self-encrypting polymorphic viruses gained popularity and brought to life generators of polymorphic code - in early 1992 the famous "Dedicated" virus appears, based on the first known polymorphic generator MtE and the first in a series of MtE-viruses; shortly after that there appears the polymorphic generator itself. It is essentially an object module (OBJ file), and now to get a polymorphic mutant virus from a conventional non-encrypting virus it is sufficient to simply link their object modules together - the polymorphic OBJ file and the virus OBJ file. Now to create a real polymorphic virus one doesn't have to dwell on the code of his own encryptor/decryptor. He may now connect the polymorphic generator to his virus and call it from the code of the virus when desired. Luckily the first MtE-virus wasn't spread and did not cause epidemics. In their turn the anti-virus developers had some time in store to prepare for the new attack. In just a year production of polymorphic viruses becomes a "trade", followed by their "avalanche" in 1993. Among the viruses coming to my collection the volume of polymorphic viruses increases. It seems that one of the main directions in this uneasy job of creating new viruses becomes the creation and debugging of polymorphic mechanisms; the authors of viruses compete not in creating the toughest virus but the toughest polymorphic mechanism instead. This is a partial list of the viruses that can be called 100 percent polymorphic (late 1993): Bootache, CivilWar (four versions), Crusher, Dudley, Fly, Freddy, Ginger, Grog, Haifa, Moctezuma (two versions), MVF, Necros, Nukehard, PcFly (three versions), Predator, Satanbug, Sandra, Shoker, Todor, Tremor, Trigger, Uruguay (eight versions). These viruses require special methods of detection, including emulation of the virus's executable code, mathematical algorithms for restoring parts of the code and data in the virus, etc. Ten more new viruses may be considered non-100 percent polymorphic (that is, they do encrypt themselves, but in the decryption routine there always exist some non-changing bytes): Basilisk, Daemaen, Invisible (two versions), Mirea (several versions), Rasek (three versions), Sarov, Scoundrel, Seat, Silly, Simulation. However, to detect them and to restore the infected objects, code decrypting is still required, because the length of non-changing code in the decryption routine of those viruses is too small. Polymorphic generators are also being developed together with polymorphic viruses. Several new ones appear utilizing more complex methods of generating polymorphic code. They become widely spread over the bulletin board systems as archives containing object modules, documentation and examples of use. By the end of 1993 there are seven known generators of polymorphic code. They are:
MTE 0.90 (Mutation Engine), TPE (Trident Polymorphic Engine), four versions of NED (Nuke Encryption Device), and DAME (Dark Angel's Multiple Encryptor). Since then every year has brought several new polymorphic generators, so there is little sense in publishing the entire lists.

Automating Production and Viral Construction Sets
Laziness is the moving force of progress (the wheel was constructed because it was too much effort to carry mammoths to the cave). This traditional wisdom needs no comments. But only in the middle of 1992 did progress, in the form of automated production, touch the world of viruses. On the fifth of July 1992 the first viral code construction set for IBM PC compatibles, called VCL (Virus Creation Laboratory) version 1.00, was declared for production and shipping. This set allows one to generate well-commented source texts of viruses in the form of assembly language texts, object modules and infected files themselves. VCL uses a standard windowed interface. With the help of a menu system one can choose the virus type, the objects to infect (COM or/and EXE), the presence or absence of self-encryption, measures of protection from debugging, internal text strings, 10 optional additional effects, etc. Viruses can use the standard method of infecting a file by adding their body to the end of the file, or replace files with their body, destroying the original content of a file, or become companion viruses. And then it became much easier to do wrong: if you want somebody to have some computer trouble, just run VCL and within 10 to 15 minutes you have 30-40 different viruses you may then run on the computers of your enemies. A virus to every computer! The further the better. On the 27th of July the first version of PS-MPC (Phalcon/Skism Mass-Produced Code Generator) appeared. This set does not have a windowed interface; it uses a configuration file to generate viral source code. This file contains a description of the virus: the type of infected files (COM or EXE); resident capabilities (unlike VCL, PS-MPC can also produce resident viruses); the method of installing the resident copy of the virus; self-encryption capabilities; the ability to infect COMMAND.COM and lots of other useful information. Another construction set, G2 (Phalcon/Skism's G2 0.70 beta), has been created. It supported PS-MPC configuration files, while allowing many more options when coding the same functions. The version of G2 I have is dated the first of January 1993. Apparently the authors of G2 spent New Year's Eve in front of their computers. They'd better have had some champagne instead; this wouldn't have hurt anyway. So in what way did the virus construction sets influence electronic wildlife? In my virus collection there are:
• several hundred VCL and G2 based viruses;
• over a thousand PS-MPC based viruses.
So we have another tendency in the development of computer viruses: the increasing number of "construction set" viruses; more unconcealably lazy people join the ranks of virus makers, downgrading a respectable and creative profession of creating viruses to a mundane rough trade.

Outside DOS
The year 1992 brought more than polymorphic viruses and virus construction sets. The end of the year saw the first virus for Windows, which thus opened a new page in the history of virus making. Being small (less than 1K in size) and absolutely harmless, this non-resident virus quite proficiently infected executables of the new Windows format (NewEXE); a window into the world of Windows was opened with its appearance on the scene.
After some time there appeared viruses for OS/2, and January 1996 brought the first Windows95 virus. Presently not a single week goes by without new viruses infecting non-DOS systems; possibly the problem of non-DOS viruses will soon become more important than the problem of DOS viruses. Most likely the process of changing priorities will resemble the process of DOS dying and new operating systems gaining strength together with their specific programs. As soon as all the existing software for DOS is replaced by its Windows, Windows95 and OS/2 analogues, the problem of DOS viruses will become nonexistent and purely theoretical for computer society. The first attempt to create a virus working in 386 protected mode was also made in 1993. It was a boot virus, "PMBS", named after a text string in its body. After boot-up from an infected drive this virus switched to protected mode, made itself supervisor and then loaded DOS in virtual window mode V86. Luckily this virus was born dead - its second generation refused to propagate due to several errors in the code. Besides that, the infected system "hung" if some of the programs tried to reach outside the V86 mode, for example to determine the presence of extended memory. This unsuccessful attempt to create a supervisor virus remained the only one up to the spring of 1997, when one Moscow prodigy released "PM.Wanderer" - a quite successful implementation of a protected mode virus. It is unclear now whether those supervisor viruses might present a real problem for users and anti-virus program developers in the future. Most likely not, because such viruses must "go to sleep" while new operating systems (Windows 3.xx, Windows95/NT, OS/2) are up and running, allowing for easy detection and killing of the virus. But a full-scale stealth supervisor virus may mean a lot of trouble for "pure" DOS users, because it is absolutely impossible to detect such a stealth virus under pure DOS.

Macro Virus Epidemics
August 1995. All progressive humanity, Microsoft and Bill Gates personally celebrate the release of a new operating system, Windows95. With all that noise, the message about a new virus using basically new methods of infection came virtually unnoticed. The virus infected Microsoft Word documents. Frankly, it wasn't the first virus infecting Word documents. Earlier, anti-virus companies had had on their hands the first experimental example of a virus which copied itself from one document to another. However, nobody paid serious attention to that not quite successful experiment. As a result, virtually all the anti-virus companies appeared not ready for what came next - macro virus epidemics - and started to work out quick but inadequate steps in order to put an end to it. For example, several companies almost simultaneously released document anti-viruses, acting along about the same lines as the virus did, but destroying it instead of propagating. By the way, it became necessary to correct anti-virus literature in a hurry, because earlier the question "Is it possible to infect a computer by simply reading a file?" had been answered by a definite "No way!" with lengthy proofs of that. As for the virus, which by that time had got its name, "Concept", it continued its victorious ride over the planet. Having most probably been released in some division of Microsoft, "Concept" ran over thousands if not millions of computers in no time at all.
It's not unusual, because text exchange in the format of Microsoft Word became in fact one of the industry standards, and to get infected by the virus it is sufficient just to open an infected document; then all the documents edited by the infected copy of Word become infected too. As a result, having received an infected file over the Internet and opened it, the unsuspecting user became an "infection peddler", and if his correspondence was made with the help of MS Word, it also became infected! Therefore the possibility of infecting MS Word, multiplied by the speed of the Internet, became one of the most serious problems in all the history of the existence of computer viruses. In less than a year, sometime in the summer of 1996, there appeared the "Laroux" virus, infecting Microsoft Excel spreadsheets. As it had been with "Concept", this new virus was discovered almost simultaneously in several companies. The same year 1996 witnessed the first macro virus construction sets; then in the beginning of 1997 came the first polymorphic macro viruses for MS Word and the first viruses for Microsoft Office97. The number of various macro viruses also increased steadily, reaching several hundred by the summer of 1997. Macro viruses, which opened a new page in August 1995, using all the experience in virus making accumulated over almost 10 years of continuous work and enhancements, actually do present the biggest problem for modern virology.

Chronology of Events
It's time to give a more detailed description of events. Let's start from the very beginning.

Late 1960s - early 1970s
Periodically on the mainframes of that period of time there appeared programs called "the rabbit". These programs cloned themselves and occupied system resources, thus lowering the productivity of the system. Most probably "rabbits" did not copy themselves from system to system and were strictly local phenomena - mistakes or pranks by system programmers servicing these computers. The first incident which may well be called an epidemic of "a computer virus" happened on the Univac 1108 system. The virus called "Pervading Animal" merged itself to the end of executable files - virtually the same thing thousands of modern viruses do.

The first half of the 1970s
"The Creeper" virus, created under the Tenex operating system, used global computer networks to spread itself. The virus was capable of entering a network by itself via modem and transferring a copy of itself to a remote system. "The Reaper" anti-virus program was created to fight this virus; it was the first known anti-virus program.

Early 1980s
Computers become more and more popular. An increasing number of programs appear, written not by software companies but by private persons; moreover, these programs may be freely distributed and exchanged through general access servers - BBS. As a result there appears a huge number of miscellaneous "Trojan horses", programs doing some kind of harm to the system when started.

1981
The "Elk Cloner" bootable virus epidemic started on Apple II computers. The virus attached itself to the boot sector of diskettes to which there were calls. It showed itself in many ways - turned over the display, made text displays blink and showed various messages.

1986
The pandemic of the first IBM PC virus, "Brain", began. This virus, infecting 360 KB diskettes, spread over the world almost momentarily. The secret of a "success" like this lay probably in the total unpreparedness of computer society for such a phenomenon as a computer virus.
The virus was created in Pakistan by the brothers Basit and Amjad Farooq Alvi, who left a text message inside it with their names, address and telephone number. According to the authors, they were software vendors who wanted to gauge the extent of piracy in their country. Unfortunately their experiment did not stay within the borders of Pakistan. Interestingly, "Brain" was also the first stealth virus: whenever an attempt was made to read an infected sector, the virus substituted the clean original.

Also in 1986, a programmer named Ralph Burger discovered that a program could create copies of itself by appending its code to DOS executables. His first virus, "VirDem", demonstrated this capability. It was announced in December 1986 at an underground computer forum of hackers who at the time specialized in cracking VAX/VMS systems (the Chaos Computer Club in Hamburg).

1987: the "Vienna" virus appears. Ralph Burger, whom we already know, obtains a copy, disassembles it and publishes the result in his book "Computer Viruses: A High-Tech Disease". The book popularized the idea of writing viruses, explained how to do it, and thereby stimulated the creation of hundreds and later thousands of computer viruses implementing ideas from its pages. Several more IBM PC viruses were written independently in the same year: "Lehigh", infecting only the COMMAND.COM file; "Suriv-1" a.k.a. "April1st", infecting COM files; "Suriv-2", infecting EXE files (for the first time ever); and "Suriv-3", infecting both COM and EXE files. Several boot viruses also appeared ("Yale" in the USA, "Stoned" in New Zealand, "PingPong" in Italy), along with the first self-encrypting file virus, "Cascade". Non-IBM computers were not forgotten either: several viruses for the Apple Macintosh, Commodore Amiga and Atari ST were detected.

December 1987 brought the first full-scale epidemic of a network virus, "Christmas Tree", written in the REXX language and spreading under the VM/CMS operating environment. On the ninth of December the virus was introduced into the Bitnet network at a West German university; via a gateway it reached the European Academic Research Network (EARN) and then the IBM VNet. Within four days (December 13) the virus had paralyzed the network, which was flooded with copies of it (see the desk clerk example several pages earlier). On start-up the virus displayed an image of a Christmas tree and then sent copies of itself to all network users whose addresses appeared in the system files NAMES and NETLOG.

1988: on Friday the 13th, companies and universities in many countries of the world "got acquainted" with the "Jerusalem" virus. On that day the virus destroyed every file that was run. It was probably one of the first MS-DOS viruses to cause a real pandemic, with reports of infected computers coming from Europe, America and the Middle East. Incidentally, the virus took its name from one of the places it struck, the university in Jerusalem. "Jerusalem", together with several other viruses ("Cascade", "Stoned", "Vienna"), infected thousands of computers while remaining unnoticed; anti-virus programs were not as common then as they are now, and many users, and even professionals, did not believe computer viruses existed.
Notably, in the same year the legendary computer guru Peter Norton announced that computer viruses did not exist, declaring them a myth of the same kind as alligators in the New York sewers. This delusion did not prevent Symantec from launching its own anti-virus project, Norton Anti-Virus, some time later.

Notoriously false reports about new computer viruses also began to appear, causing panic among users. One of the first virus hoaxes of this kind came from one "Mike RoChenle" (pronounced much like "Micro Channel"), who posted numerous messages to BBS systems describing a supposed virus that copied itself from one BBS to another over 2400-baud modem connections. Funny as it may seem, many users abandoned the 2400-baud standard of the time and lowered their modem speed to 1200 baud. Similar hoaxes appear even now; the most famous so far are GoodTimes and Aol4Free.

November 1988: a total epidemic of the Morris network virus (a.k.a. the Internet Worm). The virus infected more than 6,000 computer systems in the USA (including a NASA research institute) and practically paralyzed their work. Because of errors in its code it sent unlimited copies of itself to other computers on the network, like the "Christmas Tree" worm, and thus completely saturated network resources. Total losses caused by the Morris worm were estimated at 96 million dollars. To propagate, the virus exploited errors in the Unix operating systems for VAX and Sun Microsystems machines. Beyond those errors it used several other original ideas, for example guessing user passwords. A more detailed account of the virus and the surrounding incidents can be found in a number of detailed and interesting articles.

December 1988: the season of worms continued, this time in DECNet. A worm called HI.COM displayed an image of a spruce tree and told users they should "stop computing and have a good time at home!!!" New anti-virus programs also appeared, for example Dr. Solomon's Anti-Virus Toolkit, presently one of the most powerful anti-virus packages.

1989: new viruses "Datacrime" and "FuManchu" appear, as do whole families such as "Vacsina" and "Yankee". The first acted extremely dangerously: from October 13th to December 31st it formatted hard disks. This virus "broke free" and caused total hysteria in the mass media in Holland and Great Britain. September 1989: one more anti-virus program begins shipping, IBM Anti-Virus. October 1989: another epidemic in DECNet, this time the worm known as "WANK Worm". December 1989: an incident with a "Trojan horse" called "AIDS". 20,000 copies were shipped on diskettes marked "AIDS Information Diskette Version 2.0". After 90 boot-ups the Trojan encrypted all filenames on the disk, made them invisible (by setting the "hidden" attribute) and left only one file readable: a bill for $189 payable to the address P.O. Box 7, Panama. The author of the program was apprehended and sent to jail.

It should be noted that 1989 was also the year total computer virus epidemics began in Russia, caused by the same "Cascade", "Jerusalem" and "Vienna", which besieged Russian users' computers. Luckily, Russian programmers quickly worked out how they operated, and several domestic anti-virus programs appeared almost immediately; AVP (named "-V" at the time) was one of them.
My first acquaintance with viruses (it was the "Cascade" virus) took place in 1989, when I found the virus on my office computer. That incident influenced my decision to change careers and write anti-virus programs. A month later a second incident (the "Vacsina" virus) was closed with the help of the first version of my anti-virus "-V" (minus-virus), renamed several years later to AVP, the AntiViral Toolkit Pro. By the end of 1989 several dozen viruses roamed Russian soil. In order of appearance they were: two versions of "Cascade", several "Vacsina" and "Yankee" viruses, "Jerusalem", "Vienna", "Eddie" and "PingPong".

1990: this year brought several notable events. The first was the appearance of the first polymorphic viruses, the "Chameleon" family (a.k.a. "V2P1", "V2P2" and "V2P6"). Until then anti-virus programs had used "masks", fragments of virus code, to look for viruses; after "Chameleon" appeared, anti-virus developers had to look for other detection methods (a brief illustrative sketch of the mask approach appears a little further below). The second event was the appearance of the Bulgarian "virus production factory": enormous numbers of new viruses were created in Bulgaria, among them whole families such as "Murphy", "Nomenclatura" and "Beast" (or "512", "Number-of-Beast"), as well as modifications of the "Eddie" virus. A certain Dark Avenger became extremely active, producing several new viruses a year that used fundamentally new algorithms for infection and for covering their tracks in the system. It was also in Bulgaria that the first BBS dedicated to exchanging virus code and information for virus makers was opened. In July 1990 there was an incident with the British computer magazine "PC Today": it shipped with a floppy disk infected with the "DiskKiller" virus, and more than 50,000 copies were sold. In the second half of 1990 two stealth monsters appeared, "Frodo" and "Whale". Both used extremely complicated stealth algorithms; on top of that, the 9KB "Whale" used several layers of encryption and anti-debugging techniques.

1991: the computer virus population grows continuously, now reaching several hundred. Anti-virus activity also increases: two software giants (Symantec and Central Point) release their own anti-virus programs, Norton Anti-Virus and Central Point Anti-Virus, followed by less well-known anti-viruses from Xtree and Fifth Generation. In April a full-scale epidemic broke out, caused by the file and boot polymorphic virus "Tequila", and in September the same kind of story happened with the "Amoeba" virus. Summer of 1991: the "Dir_II" epidemic, a link virus using fundamentally new methods of infecting files.

1992: non-IBM PC and non-MS-DOS viruses are virtually forgotten: the "holes" in global networks have been closed, errors corrected, and network worms have lost the ability to spread. File, boot and file-boot viruses for the most widespread operating system (MS-DOS) on the most popular computer platform (IBM PC) become ever more important. The number of viruses grows geometrically, and virus incidents happen almost every day. Miscellaneous anti-virus programs are developed, and dozens of books and several periodicals on the subject are printed. A few things stand out. Early 1992: the first polymorphic generator, MtE, which serves as a base for several polymorphic viruses that follow almost immediately.
MtE was also the prototype for several forthcoming polymorphic generators.

March 1992: the "Michelangelo" epidemic (a.k.a. "March6") and the hysteria that followed. This is probably the first known case of anti-virus companies raising a fuss about a virus not to protect users from any real danger but to attract attention to their products, that is, to generate profit. One American anti-virus company actually announced that on the 6th of March the information on over five million computers would be destroyed. The profits of various anti-virus companies jumped severalfold as a result; in reality only about 10,000 computers suffered from the virus. July 1992: the first virus construction sets, VCL and PS-MPC, were released. They made the already large flow of new viruses even larger and stimulated virus writers to create other, more powerful construction sets, much as MtE had done in its own area. Late 1992: the first Windows virus appears, infecting the executables of this OS and opening a new page in virus writing.

1993: virus writers begin to do serious work. Besides hundreds of mundane viruses no different from their counterparts, besides polymorphic generators and construction sets, besides new electronic magazines for virus makers, more and more viruses appear that use highly unusual ways of infecting files and introducing themselves into the system. The main examples are: "PMBS", working in Intel 80386 protected mode; "Strange" (or "Hmm"), a "masterpiece" of stealth technology implemented at the level of the hardware interrupts INT 0Dh and INT 76h; "Shadowgard" and "Carbunkle", which widened the range of companion-virus algorithms; and "Emmie", "Metallica", "Bomber", "Uruguay" and "Cruncher", which used fundamentally new techniques for hiding their code inside infected files. In the spring of 1993 Microsoft released its own anti-virus, MSAV, based on Central Point's CPAV.

1994: the problem of CD viruses grows in importance. Having quickly gained popularity, CDs became one of the main means of spreading viruses, and there were several cases in which a virus reached the master disk during the preparation of a CD batch. As a result a fairly large number of infected CDs, tens of thousands, hit the market. They cannot, of course, be cured; they simply have to be destroyed. Early in the year two extremely complicated polymorphic viruses, "SMEG.Pathogen" and "SMEG.Queeg", turned up in Great Britain (even now not all anti-virus programs detect them with 100% accuracy). Their author uploaded infected files to a BBS, causing real panic and fear of an epidemic in the mass media. Another wave of panic was created by a message about a supposed virus called "GoodTimes" that spread via the Internet and infected a computer upon receipt of e-mail. No such virus ever existed, though some time later an ordinary DOS virus containing the text string "Good Times" did appear; it was named "GT-Spoof". Law enforcement stepped up its activity: in the summer of 1994 the author of SMEG was tracked down and arrested, and at about the same time, also in Great Britain, an entire group of virus writers calling themselves ARCV (Association for Really Cruel Viruses) was arrested. Some time later one more virus author was arrested in Norway.
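As an aside for the technically curious, here is what the "masks" mentioned under 1990 and 1992 amount to in practice. The sketch below is a minimal, purely illustrative Python rendering of mask (signature) scanning, not any vendor's real engine; the virus names and byte patterns are invented placeholders rather than real virus code. It also makes clear why polymorphic viruses such as "Chameleon" and the MtE-based families defeated the approach: when a virus re-encrypts its body differently in every generation, no fixed fragment survives to be matched.

import sys

# Invented placeholder signatures ("masks"): short byte fragments assumed to be
# unique to a known virus body. Real scanners of the era carried thousands of these.
KNOWN_MASKS = {
    "Hypothetical.Demo.A": bytes.fromhex("deadbeef00112233"),
    "Hypothetical.Demo.B": b"\x90\x90\xcd\x21\xb4\x4c",
}

def scan_file(path):
    """Return the names of every mask found in the file's raw bytes."""
    with open(path, "rb") as fh:
        data = fh.read()
    return [name for name, mask in KNOWN_MASKS.items() if mask in data]

if __name__ == "__main__":
    for target in sys.argv[1:]:
        hits = scan_file(target)
        print(f"{target}: {'suspicious (' + ', '.join(hits) + ')' if hits else 'no known mask found'}")

Run over a set of files, such a scanner is fast but brittle: change one byte of the fragment, or encrypt the virus body anew with each infection as polymorphic viruses do, and detection fails, which is exactly the problem the chronology describes.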
Some new and rather unusual viruses also appeared. January 1994: "Shifter", the first virus to infect object modules (OBJ files), and "Phantom1", the cause of the first polymorphic virus epidemic in Moscow. April 1994: "SrcVir", a virus family infecting program source code (C and Pascal). June 1994: "OneHalf", still one of the most widespread viruses in Russia, starts a total epidemic. September 1994: "3APA3A", a boot-file virus epidemic; this virus incorporated itself into MS-DOS in a highly unusual way, and no anti-virus was ready to meet such a monster. In the spring of 1994 one of the anti-virus leaders of that time, Central Point, ceased to exist, acquired by Symantec, which by then had already swallowed several smaller anti-virus companies: Peter Norton Computing, Certus International and Fifth Generation Systems.

1995: nothing in particular happens among DOS viruses, although several complicated monster viruses such as "NightFall", "Nostradamus" and "Nutcracker" appear, as well as some amusing ones like the "bisexual" virus "RMNS" and the BAT virus "Winstart". The "ByWay" and "DieHard2" viruses become widespread, with reports of infected computers coming from all over the world. February 1995: an incident involving Microsoft: Windows95 demo disks were infected by "Form". Microsoft had sent copies of these disks to beta testers; one of the testers was diligent enough to check them for viruses. Spring 1995: two anti-virus companies, ESaSS (ThunderBYTE Anti-Virus) and Norman Data Defense (Norman Virus Control), announce an alliance. Each the maker of a powerful anti-virus in its own right, they joined forces and started work on a joint anti-virus system. August 1995: one of the turning points in the history of viruses and anti-viruses: the first "live" virus for Microsoft Word ("Concept") appeared. Within a month or so the virus had "tripped around the world", pestering the computers of MS Word users and taking a firm No. 1 spot in the statistics gathered by various computer publications.

1996. January 1996: two notable events: the appearance of the first Windows95 virus ("Win95.Boza") and the epidemic of the extremely complicated polymorphic virus "Zhengxi" in St. Petersburg, Russia. March 1996: the first Windows 3.x virus epidemic, caused by "Win.Tentacle". This virus infected a computer network at a hospital and in several other institutions in France. The event is especially interesting because it was the FIRST Windows virus caught on a spree; until then (as far as I know) all Windows viruses had lived only in collections and the electronic magazines of virus writers, and only boot viruses, DOS viruses and macro viruses had been known to roam free. June 1996: "OS2.AEP", the first virus for OS/2 that correctly infects the EXE files of this operating system; previously the only OS/2 viruses had either overwritten files, destroying them, or acted as companions. July 1996: "Laroux", the first virus for Microsoft Excel caught live (at roughly the same time in two oil companies, one in Alaska and one in South Africa). The idea of "Laroux", like that of the Microsoft Word viruses, rests on the presence of so-called macros, Basic programs, in the files; such programs can be embedded in both Microsoft Excel spreadsheets and Microsoft Word documents.
As it turned out, the Basic language built into Microsoft Excel also makes it possible to create viruses. December 1996: "Win95.Punch", the first memory-resident virus for Windows95. It stays in Windows memory as a VxD driver, hooks file access and infects Windows EXE files as they are opened.

Overall, 1996 marks the start of widespread virus intrusion into the Win32 operating systems (Windows95 and WindowsNT) and the Microsoft Office applications. During this and the following year several dozen Windows viruses and several hundred macro viruses appeared, many of them using new technologies and infection methods, including stealth and polymorphic capabilities. This was the next round of virus evolution: within two years they retraced the improvement path of the DOS viruses, step by step adopting the same features DOS viruses had used ten years earlier, but at the next technological level.

1997. February 1997: "Linux.Bliss", the first virus for Linux (a Unix clone); viruses thereby occupied one more "biological" niche. February-April 1997: macro viruses migrate to Office97. The first of them turned out to be merely macro viruses for Microsoft Word 6/7 "converted" to the new format, but almost immediately viruses aimed exclusively at Office97 documents appeared as well. March 1997: "ShareFun", a macro virus hitting Microsoft Word 6/7; it propagates not only through the standard features of Microsoft Word but also by sending copies of itself via MS-Mail. April 1997: "Homer", the first network worm to use the File Transfer Protocol (FTP) for propagation. June 1997: the first self-encrypting virus for Windows95 appears; this virus, of Russian origin, was uploaded to several BBSes in Moscow and caused an epidemic. November 1997: the "Esperanto" virus, the first virus intended to infect not only DOS and Win32 executables but also Mac OS (Macintosh); fortunately, bugs prevent it from spreading across platforms. December 1997: a new virus type, the so-called mIRC worms, came into being. mIRC, the most popular Windows Internet Relay Chat (IRC) client, proved to contain a "hole" that allowed virus scripts to transmit themselves over IRC channels. The next mIRC version closed the hole and the mIRC worms vanished.

The main event of 1997, certainly, was the anti-virus department of KAMI Ltd. breaking away from its parent company to become independent; the company, now known as Kaspersky Lab, has proved to be a recognized leader of the anti-virus industry. Since 1994 its main product, the AntiViral Toolkit Pro (AVP) scanner, has consistently shown high results in tests by laboratories around the world. Becoming an independent company gave the initially small group of developers the chance to take the lead on the domestic market and gain prominence on the world market: in a short time versions for practically all popular platforms were developed and released, new anti-virus solutions were offered, and international distribution and product-support networks were created. October 1997: an agreement was signed licensing AVP technologies for use in F-Secure Anti-Virus (FSAV), the new anti-virus product of DataFellows (Finland); DataFellows had previously been known as the manufacturer of the F-PROT anti-virus package.
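Since the entries above explain that Word and Excel macro viruses live as embedded Basic (VBA) programs inside otherwise ordinary documents, here is a small defensive sketch of the simplest check a tool might make: does a document carry an embedded macro project at all? This is an assumption-laden illustration, not any product's real detection logic, and it targets the modern zip-based Office formats (.docm, .xlsm), in which macros are stored in a vbaProject.bin part; the 1995-era binary .doc/.xls formats discussed above would need an OLE parser instead. The file name in the example is hypothetical.

import zipfile

def contains_vba_project(path):
    """True if a zip-based Office file embeds a vbaProject.bin macro container."""
    try:
        with zipfile.ZipFile(path) as zf:
            return any(name.lower().endswith("vbaproject.bin") for name in zf.namelist())
    except zipfile.BadZipFile:
        # Not a zip container, e.g. a legacy binary-format document.
        return False

print(contains_vba_project("quarterly_report.docm"))  # hypothetical file name

The presence of a macro project is of course not proof of infection; macro viruses like "Concept" and "Laroux" were dangerous precisely because legitimate macros were common, so real scanners go on to examine the macro code itself.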
1997 was also a year of several scandals among the major anti-virus manufacturers in the US and Europe. At the beginning of the year McAfee announced that its experts had detected a "feature" in the anti-virus programs of Dr. Solomon's, one of its main competitors. The McAfee statement claimed that if Dr. Solomon's anti-virus detected several virus types during a scan, it switched to an advanced scanning mode. In other words, when scanning an uninfected computer Dr. Solomon's anti-virus would operate in its usual mode, and would switch to the advanced mode, a "cheat mode" according to McAfee, only when running over virus collections, enabling it to detect viruses invisible to the usual mode. As a result, Dr. Solomon's anti-virus would show both good speed on uninfected disks and good detection rates on virus collections. A little later Dr. Solomon's struck back, accusing McAfee of misleading advertising; the claim concerned the slogan "The Number One Choice Worldwide. No Wonder The Doctor's Left Town". At the same time McAfee was in court with Trend Micro, another anti-virus manufacturer, over alleged violation of a patent on Internet and e-mail data scanning technology. Symantec also turned out to be involved, accusing McAfee of using Symantec code in McAfee products. And so on. The year closed with one more noteworthy event connected with the McAfee name: McAfee Associates and Network General announced their consolidation into the newly formed Network Associates, positioning their services not only on the anti-virus market but also on the markets for general computer security systems, encryption and network administration. From this point in virus and anti-virus history, McAfee corresponds to NAI.

1998: the virus assault on MS Windows, MS Office and network applications does not weaken. New viruses arise that employ ever more complex tricks to infect computers and advanced methods of penetrating from network to computer. Numerous so-called Trojans that steal Internet access passwords, as well as several kinds of covert remote-administration utilities, also enter the computing world. Several incidents with infected CDs came to light: some computer magazine publishers distributed CIH and Marburg (Windows viruses) on infected CDs attached to the covers of their issues.

Beginning of the year: an epidemic of the "Win32.HLLP.DeTroie" virus family shocked the computer world; these viruses not only infected Win32 executables but could also report information about the infected computer back to their "owner". Because the viruses relied on specific libraries present only in the French version of Windows, the epidemic affected only French-speaking countries. February 1998: another type of virus infecting Excel spreadsheets, "Excel4.Paix" (a.k.a. "Formula.Paix"), was detected. This macro virus lodges itself in Excel spreadsheets not through the macro area usual for this kind of virus but through formulas, which proved capable of accommodating self-reproducing code. February-March 1998: "Win95.HPS" and "Win95.Marburg", the first polymorphic Win32 viruses, were detected, and in the wild at that. Anti-virus developers had no choice but to rush to adapt their polymorphic-virus detection techniques, until then designed only for DOS viruses, to the new conditions.
March 1998: "AccessiV", the first Microsoft Access virus, was born. There was no great outcry about it (as there had been with "Word.Concept" and "Excel.Laroux"), since the computing community had already grown used to MS Office applications falling thick and fast. March 1998: the "Cross" macro virus, the first virus to infect two different MS Office applications, Access and Word, is detected. Soon several more viruses emerged that transfer their code from one MS Office application to another. May 1998: the "RedTeam" virus infects Windows EXE files and dispatches the infected files through Eudora e-mail.

June 1998: the "Win95.CIH" epidemic started out as a mass outbreak, then became global and finally turned into a kind of computer holocaust; reports of infections on corporate networks and home computers numbered in the hundreds if not thousands. The epidemic began in Taiwan, where an unknown hacker mailed infected files to local Internet conferences. From there the virus made its way to the USA, where through staff oversight it immediately infected several popular Web servers that began distributing infected game programs. Most likely it was these infected files on game servers that brought about the computer holocaust that dominated the computer world for the rest of the year. In the "popularity" ratings the virus pushed "Word.CAP" and "Excel.Laroux" into second place. One should also note the virus's dangerous payload: depending on the current date it erased the Flash BIOS, which under some conditions could kill the motherboard.

August 1998: the birth of the sensational "BackOrifice" ("Backdoor.BO"), a utility for covert (hacker) control of remote computers and networks. After "BackOrifice" several similar programs, "NetBus", "Phase" and others, came into being. Also in August the first virus infecting Java executables, "Java.StrangeBrew", was born. It posed no danger to Internet users, since there was no way to invoke the functions critical for its replication on a remote computer; still, it revealed that even programs run by Web browsers could be attacked by viruses.

November 1998: "VBScript.Rabbit". The Internet expansion of computer parasites continued with three viruses infecting VisualBasic scripts (VBS files), which were actively used in Web page development. As a logical consequence of the VBScript viruses, a full-fledged HTML virus ("HTML.Internal") came to life. Virus writers had clearly turned their efforts toward network applications and toward creating a full-fledged network worm that could exploit MS Windows and Office features, infect remote computers and Web servers, and/or replicate itself aggressively through e-mail.

The world of anti-virus manufacturers was also considerably rearranged. In May 1998 Symantec and IBM announced that they were joining forces on the anti-virus market: the joint product would be distributed under the Norton Anti-Virus trademark, and the IBM Anti-Virus (IBMAV) program would be discontinued. The main competitors, Dr. Solomon's and NAI (formerly McAfee), responded immediately with press releases offering IBM's users promotional replacement of the defunct anti-virus with their own products. Less than one month later Dr. Solomon's "committed suicide". The

Wednesday, October 23, 2019

Early Childhood Literacy Proposal Essay

Abstract

Research on early childhood literacy identifies the early childhood years as the foundational period for developing the language and literacy skills that underpin a young child's long-term success in reading and writing. This study gives theoretical attention to the essential components of literacy that promote and predict a child's emergent literacy development, an aspect of learning acquisition that is critical to a child's school readiness. Findings support and highlight how acquiring skills in components of literacy such as phonological awareness, vocabulary and language knowledge, alphabet and sound recognition, and print and text comprehension, together with teachers' use of sound instructional practices and strategies, promotes optimal success in early literacy and beyond.

Introduction

Early childhood literacy is an essential and extensive branch of education that seeks to equip young children with the skills that will enable them to emerge as readers and writers. These foundational skills are critical and predictive of later success in reading and writing. Research notes that depending on where they start, their experiences in the home, and the curriculum used in their classroom, many children will leave preschool with early literacy skills that put them on a trajectory to transition successfully to learning to read (Lonigan, Allan, & Lerner, 2011). In other words, these skills are manifested early in life and are the precursor of future achievement in literacy. The developmental stage for acquiring these precursor skills begins in infancy and extends into the primary years; for the purpose of this study, however, early literacy skills are defined as those that occur at the preschool ages of 3-4. It is also important to note that effective preschool programs are the arm of early education that promotes, supports, and contributes to a child's future reading and writing readiness. These factors characterize the role of early childhood programs in promoting children's early literacy development for later achievement in reading. The acquisition of children's reading skills was once thought to begin with the start of reading instruction in elementary school, but research now supports the idea that learning to read is a continuous developmental process that emerges early in life (Wilson & Lonigan, 2009). A study is therefore proposed to increase the focus on the early years of education as the precursor of later success in literacy, to identify the early literacy skills that foster that success, and to inform the assessments and strategies that constitute best practice for gathering this evidence. The following research question and hypotheses guide this proposal. Research question: Does the acquisition of early literacy skills foster future success in literacy? Hypothesis: The acquisition of early literacy skills fosters future success in literacy. Subsequent hypotheses: 1) Literacy-rich environments or settings contribute to a child's future success in reading. 2) Effective teaching strategies support a child's development of literacy.
These modes and mechanisms form the basis for providing children with an effective curriculum, strategies, techniques, and activities that will build their knowledge and give them a sound foundation of emergent literacy. The term emergent literacy is a relatively new one that evolved in response to evidence that literacy development occurs along a continuum beginning long before children start formal schooling and long before they acquire conventional literacy skills such as decoding, oral reading, reading comprehension, spelling, and writing (Invernizzi, Landrum, Teichman, & Townsend, 2010). The literacy learning phase begins at birth and extends through the preschool years and beyond. Infants begin to grasp books and bring them to caregivers or parents to read. Around the age of two, children begin to recognize favorite books by their covers and can memorize and restate some of the words. Between the ages of three and four, children are able to picture-read and retell stories as well as manipulate letters and print. At ages five and six, children begin to understand that words have meaning.

The emergent skills and abilities that are strong predictors of later reading and writing outcomes include the following: 1) Phonological sensitivity: children begin to hear and understand the various sounds and patterns of spoken language. These skills begin with listening to sounds, then noticing and discriminating rhyme and alliteration; afterwards children begin to identify syllables in words by examining onset and rime. Phonological awareness skills generally progress to advanced phonemic awareness and later lay the foundation for phonics. They are further promoted as children sing songs, hear stories, and take part in finger plays or rhymes (Heroman & Jones, 2010). "Research has found phonological awareness skills in preschool to be one of the most robust predictors of early reading success in a child's first few years of formal schooling" (Callaghan & Madelaine, 2012). 2) Print knowledge: children's ability to organize and convey the meaning of words through sounds, words, or sentences. The conventions of print modeled by teachers and learned by children, which eventually build awareness of the functions of print, include providing print-rich environments, interacting during story times, and watching adults write and read books. 3) Alphabet knowledge: children begin to match letters and their sounds to printed letters. A child's knowledge of the alphabet is the single best predictor of first-year reading success (Elliot & Olliff, 2008). Children who are exposed to alphabetic activities and experiences, such as reading books that display the alphabet, manipulating magnetic or textured letters, playing games that reference the alphabet, and singing and saying the alphabet, have increased letter knowledge that will eventually promote reading and writing achievement. Knowledge of letter names prior to kindergarten has been found to predict reading ability in fifth and tenth grade (Wilson & Lonigan, 2008). 4) Comprehension: children make meaning of text by processing stories they have heard read aloud. They are also provided with language-rich activities, directions, and instructions as a way to understand and communicate knowledge.
Teachers can promote listening and story comprehension skills by doing the following (Heroman & Jones, 2010):
* Talk with children frequently throughout the day
* Use language that is easy for children to understand
* Help children understand language by rephrasing it when necessary
* Play listening games
* Help children learn to follow and give directions
* Read aloud to small groups of children
* Prepare children for a reading by taking a "picture walk"
* Show children the pictures as you read
* When reading to children, encourage them to ask questions, make predictions, talk about the story, and connect new ideas with what they already know
* Facilitate story retellings

Review of Related Literature

A review of the research literature reveals how early childhood literacy and learning dominates academic research on young children. The use of early literacy assessments as direct measures of students' knowledge is examined as the way to understand children's literacy development and to ascertain what counts as student learning. Early literacy instruction takes the form of isolated activities and skills that can be easily documented, measured, and quantified or qualified as the basis for evaluating the prerequisite skills for eventual success in formal reading and writing. Children are assessed on how many letters of the alphabet they know; how many sight words they can recognize; how they distinguish individual sounds or phonemes in spoken language; how they make connections between letters and sounds; and how they use language to tell stories and share information, as the way to individualize or compare a student's performance (Casbergue, 2010). Children who are at risk for later reading problems have weaker emergent literacy skills than children who are not. Several studies examining the predictive validity of emergent literacy skills for later reading skills have found that emergent literacy skills are good indicators of whether a child will have trouble with reading in the early elementary grades. It is therefore helpful for teachers to be able to measure those emergent skills accurately, determine who is most at risk for later reading problems, and implement interventions geared toward improving emergent literacy skills in at-risk children (Wilson & Lonigan, 2009).

Research suggests several programs or assessments that help teachers identify, guide, and implement the skills that lead students to early responsiveness in literacy. The article "Increased Implementation of Emergent Literacy Screening in Pre-Kindergarten" focuses on findings that emphasize how prekindergarten programs are instrumental in ensuring academic success in literacy; the findings suggest that children who attend a good Pre-K program will more than likely not have reading difficulties in later years. Teachers' use of emergent literacy assessments yields specific information about literacy development that assists the teacher in making informed decisions toward instructional goals and objectives. These assessments help teachers learn what the student knows and what they need to learn, while also informing the teacher's instructional methods and modes. Such assessments were found to help identify a student's strengths and target their weaknesses for further instructional literacy needs.
PALS-PreK, which focuses on students' alphabet knowledge, phonological awareness, print concepts, and writing skills, is a tool that measures student progress and helps teachers assess students' knowledge and level of mastery. The assessment has been used to measure the emergent literacy skills of more than 21,000 students prior to kindergarten as a way to gauge their performance. It is an easy-to-use system administered to children individually by the classroom teacher and does not rely on an allotted completion time (Invernizzi, Landrum, Teichman, & Townsend, 2010). The Creative Curriculum is an ongoing assessment tool that evaluates children using specific objective indicators and predictors of standards pertaining to school readiness and children's success in literacy. It requires teachers to record observations of children in naturalistic classroom situations or during group time, as the most accurate way of measuring a child's literacy success. Children are required to demonstrate phonological awareness, knowledge of the alphabet and sounds, knowledge of print, and emerging writing skills, as well as respond to books and other text; they are placed within a color-coded mastery level and are assessed throughout the school year (Heroman & Jones, 2010).

In the article "Assessment of Preschool Early Literacy Skills: Linking Children's Educational Needs with Empirically Supported Instructional Activities", Lonigan, Allan, and Lerner describe preschool as the critical predictive phase of learning in which children's early literacy skills are detected, developed, and directed toward their becoming skilled readers and writers. Lonigan et al. present research supporting the crucial role of teachers in providing children with a strong, literacy-enriched foundation: a rich curriculum that includes the activities needed to promote proficiency in literacy. Substantial evidence indicates that children's acquired skills in alphabet knowledge, print, phonology, and oral language contribute to the growth and achievement of their evolving literacy skills. The article further discusses three methods for determining and evaluating the skills of preschool children. The primary forms of assessment, informal assessments, screening/progress monitoring, and diagnostic assessments, are examined in relation to measuring children's developmental goals and gains and the effectiveness of the teacher's guided instruction and activities. One valid and reliable form of assessment of particular focus is the diagnostic assessment. Diagnostic assessments are reliable and valid in that they identify a child's strengths within a specific set of skills or discipline and expose mastery of it; they also measure exactly what they are intended to measure. Lonigan et al. contend, "The key advantage of diagnostic assessments include in depth examination of specific skill areas, generally high reliability, established validity of the measure, and the ability to compare a specific child's performance with a known reference group" (Lonigan, Allan, & Lerner, 2011).
The authors provide sound evidence of children's progress: the tests in the literacy areas mentioned above showed high levels of internal consistency and test-retest reliability, yielding accurate scores, and contained multiple items that further indexed the child's developmental level in literacy. A further quasi-experimental study examined how teachers enhance the early literacy skills of preschool children. The research was conducted over two years across 20 Head Start sites; 750 teachers were selected to participate, and 370 classrooms completed pre- and post-test assessments. Student performance was compared between children taught by teachers with either one or two years of training and instructional experience. Teachers who were more educated were found to be more effective in supporting students' overall achievement of early literacy skills (Landry, Swank, Smith, Assel, & Gunnewig, 2006).

Also prominent in the research literature on early childhood literacy is the importance of preschool early intervention. Researchers have identified phonological awareness as a robust precursor of later conventional literacy skills. The National Early Literacy Panel (NELP) conducted a meta-analysis of more than 299 studies of children from birth to five years of age and recognized phonological awareness as one of the most important determinants of early reading success (Callaghan & Madelaine, 2012). Researchers also detail the importance of phonological skills being taught initially in preschool, given children's phonological sensitivity during this age period; preschool children with a sound foundation of phonological skills are expected to attain reading skills in later years. Longitudinal studies that have traced the early literacy performance of preschoolers into subsequent grades have found positive literacy outcomes. Research also places significant focus on the instruction and strategies that influence preschoolers' literacy development. Researchers have suggested that preschoolers benefit more from shorter periods of intensive literacy instruction in small-group settings within a play-based curriculum than from longer periods of instruction. The following table lists the activities or skills that teachers use to promote literacy in the classroom, along with how frequently each is reported, as a way of gauging the effectiveness of the strategies.

Language and Literacy Activities in Center-Based Early Childhood Settings (N = 180)
Variable | % Reporting Often or Always | % Reporting Sometimes | % Reporting Seldom or Never | M | SD
Language and Literacy Promotion Scale (23 items) | - | - | - | 4.17 | 0.64
1. Read aloud to children in a group setting. | 78.3 | 16.7 | 5.0 | 4.24 | 0.90
2. Read aloud to children individually. | 50.0 | 30.6 | 19.4 | 3.44 | 1.07
3. Set aside special time each day to read to children. | 75.0 | 19.4 | 5.6 | 4.13 | 0.97
4. Read aloud a variety of books. | 85.6 | 9.4 | 5.0 | 4.34 | 0.87
5. Reread favorite books. | 82.8 | 12.8 | 4.4 | 4.28 | 0.90
6. Talk about books read together. | 68.9 | 20.6 | 10.6 | 3.95 | 1.11
7. Ask children questions about the books. | 74.4 | 17.8 | 7.8 | 4.10 | 1.06
8. Provide opportunities for children to look at books and other printed materials on own. | 82.2 | 13.3 | 4.4 | 4.31 | 0.90
9. Teach children features of a book. | 58.3 | 21.1 | 20.6 | 3.65 | 1.25
10. Teach children that printed letters and words run from left to right and from top to bottom. | 63.3 | 19.4 | 17.2 | 3.74 | 1.21
11. Practice saying alphabet with the children. | 93.3 | 5.0 | 1.7 | 4.60 | 0.68
12. Teach children to recognize letters of alphabet. | 90.0 | 7.8 | 2.2 | 4.54 | 0.80
13. Teach children to distinguish between uppercase and lowercase letters. | 69.4 | 20.6 | 10.0 | 3.98 | 1.19
14. Help children learn the sounds each letter can represent. | 78.9 | 12.2 | 8.9 | 4.23 | 1.09
15. Teach children to write letters of alphabet. | 71.7 | 17.2 | 11.1 | 4.05 | 1.15
16. Help children to write their names. | 74.4 | 16.1 | 9.4 | 4.10 | 1.13
17. Help children identify different colors, shapes, and sizes. | 88.3 | 8.3 | 3.3 | 4.57 | 0.80
18. Help children learn opposites. | 81.1 | 16.1 | 2.8 | 4.29 | 0.89
19. Help children recognize numbers. | 87.2 | 8.9 | 3.9 | 4.46 | 0.83
20. Practice counting with the children. | 88.9 | 9.4 | 1.7 | 4.57 | 0.75
21. Choose books to read aloud that focus on sounds, rhyming, and alliteration. | 77.2 | 16.7 | 6.1 | 4.16 | 0.93
22. Have children sing or say a familiar nursery rhyme or song. | 85.6 | 12.8 | 1.7 | 4.42 | 0.78
23. Encourage children to make up new verses of familiar songs or rhymes by changing beginning sounds or words. | 63.9 | 20.6 | 15.6 | 3.85 | 1.17
(Green & Peterson, 2006)

Methodology

The writer begins by selecting the type of research to be conducted, which is evaluation research. Two emergent literacy screening tools for preschool-age children are used as measures of the acquisition of children's emergent literacy skills: the Get Ready to Read tool (GRTR) and the Individual Growth and Development Indicators (IGDI). The GRTR test has 20 activities that strictly measure phonological and print skills: children are shown a page with four pictures and asked a question corresponding to one of the pictures, and at the end of the test the scores are tallied into a final comprehensive score. On the IGDI test, children select picture cards that answer questions in the domains of Alliteration and Rhyming, Picture Naming, and Phonological Awareness; children are given a flashcard within one of the domains, asked a question, and prompted to point to the correct answer. Scores consist of the number of correct answers completed within a specified amount of time. Both tests were administered in July and October with the consent of the preschool children's parents and lasted about 40 minutes (Wilson & Lonigan, 2009).

Participants

Twenty-one preschools in Florida participated in this study. The children's ages ranged from 42 to 55 months, with an equal distribution of boys and girls. 70% of the children were Caucasian, 19% were African American, and 11% were of another ethnicity.

Conclusion/Recommendation

The IGDI scores were weaker than those of the GRTR in terms of concurrent validity and reliability, because some of the children were unable to complete the tests; it was determined that the tests were too difficult for the age group and therefore unreliable. The GRTR was more reliable in that it was geared to the children's age. The results of the study made clear that this screener is the better measure of preschool children's emergent literacy skills as evidence of later reading performance.
Researchers, educators, and policy makers are concerned with the quality of literacy programs, the effectiveness of literacy instruction, and the achievement of students within the field of literacy. Findings from this study support how early childhood programs promote language and literacy skills for future success in reading and literacy.

References

Bright From the Start: Georgia's Department of Early Care and Learning. http://decal.ga.gov/documents/attachments/content_standards_full.pdf

Callaghan, G., & Madelaine, A. (2012). Leveling the Playing Field for Kindergarten Entry: Research Implications for Preschool Early Literacy Instruction. Australasian Journal of Early Childhood, 37, 13-23.

Casbergue, R. M. (2010). Assessment and Instruction in Early Childhood Education: Early Literacy as a Microcosm of Shifting Perspectives. 13-20.

Elliot, E. M., & Olliff, C. B. (2008). Developmentally Appropriate Emergent Literacy Activities for Young Children: Adapting the Early Literacy and Learning Model. Early Childhood Education Journal, 35, 551-556.

Green, S. D., & Peterson, R. (2006). Language and Literacy Promotion in Early Childhood Settings: A Survey of Center-Based Practices. Early Childhood Research and Practice, 14(1).

Heroman, C., & Jones, C. (2010). The Creative Curriculum for Preschool: Literacy. Vol. 35, 537-567.

Invernizzi, M., Landrum, T. L., Teichman, A., & Townsend, M. (2010). Increased Implementation of Emergent Literacy Screening in Pre-Kindergarten. Early Childhood Education Journal, 37, 437-446.

Landry, S., Swank, P. R., Smith, K. E., & Assel, M. A. (2006). Enhancing Early Literacy Skills for Preschool Children: Bringing a Professional Development Model to Scale. Journal of Learning Disabilities, 39, 306-324.

Lonigan, C. J., Allan, N. P., & Lerner, M. D. (2011). Assessment of Preschool Early Literacy Skills: Linking Children's Educational Needs with Empirically Supported Instructional Activities. Psychology in the Schools, 48, 488-501.