Sunday, January 26, 2020

Theories of Government Control of the Internet

Critically analyse Lawrence Lessig's argument that the ability of governments to control activities within cyberspace is determined by the codes of cyberspace.

The Internet enables individuals to access a 'new realm of human activity'[1] and has affected the lives of billions of people. Because the Internet affects citizens of all states, many have called for legal involvement, and the increasing use of the Internet for commercial purposes has sparked initiatives to regulate the system legally[2]. Internet traffic is carried over vast communications networks which are owned and controlled by public and private sector providers. The European Commission has had to step in on a number of occasions where a merger of providers would breach competition law because of the market share each provider holds.[3]

Ian Lloyd states that the Internet is similar to other forms of communication in that it is heavily regulated, but that it lacks specific legal provisions[4]; the Communications Act 2003, for example, is said to contain few provisions on Internet regulation. At a national level, communications regulation has operated for years, and international agencies like the International Telecommunication Union adopt a more functional role towards regulating. As the Internet is a global tool, policing and regulating it poses a considerable legal and political question. Some argue that the Internet is governed by its users as they reach a consensus: regulatory structures are seen to evolve on their own rather than develop in an organised way.

Lawrence Lessig believes that governments' attempts to regulate the Internet will fail. He concedes, however, that governments may be able to regulate the architecture of the Internet, and that this could develop into a form of regulation across all areas[5]. Lessig proposes that Internet sites should have greater power to identify customers so as to recognise individuals' credentials[6]. This form of indirect regulation would form a basis of self-regulation within cyberspace. He states that the state may induce Internet service providers (ISPs) to regulate an aspect of their service by making it more difficult for non-compliant providers to do business.[7] Further, e-commerce will lead to greater involvement of the state: the commercial nature of these transactions makes identification of the parties easy. Lessig continues by warning that if the Internet is regulated by a 'closed code' then the state's effectiveness to regulate remains unchanged; if, however, the Internet adopts an 'open code', then it will act as a check on the government's power.[8]

The Internet is defined by a set of protocols (TCP/IP), which are rules for how your computer will interact with a server and vice versa. These protocols make interaction possible because users agree on a simple protocol of data exchange. A 'closed code' has bothered many who believe that an 'open code' fits the values of the Internet: free and easy file sharing. An open code is a public code, which people may view without gaining the permission of others, and so it facilitates transparency.
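To make the TCP/IP point above concrete, the following is a minimal Python sketch of the kind of exchange these protocols enable. It is an illustration only: the loopback address, the OS-assigned port, and the echo format are invented for the example, not taken from any source discussed here.

import socket
import threading

# One side listens, the other connects; both follow the same agreed
# rules for exchanging bytes, which is what makes interaction possible.
addr = []                    # the server publishes its address here
ready = threading.Event()

def serve_once():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))    # port 0: let the OS pick a free port
        srv.listen(1)
        addr.append(srv.getsockname())
        ready.set()
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)            # read the client's bytes
            conn.sendall(b"echo: " + request)    # reply in the agreed format

threading.Thread(target=serve_once, daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(addr[0])
    cli.sendall(b"hello")
    print(cli.recv(1024))    # b'echo: hello'

Neither machine needs permission from any authority to take part; each side interoperates simply because both have agreed on the same simple rules of exchange.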
Alternative views hold that the rise in e-commerce will result in greater input from the state, but there are problems connected to regulating e-commerce. Rowland and Macdonald point out that there are inherent difficulties in regulating e-commerce, as it is not geographically or jurisdictionally restricted, and there are also competing pressures over whether to regulate or not to regulate, as seen in Lessig's argument[9].

Lars Davies states that two dangers must be avoided[10]. The first danger is under-regulation, which would create the perception that e-commerce is an activity carrying an unacceptably high element of risk, deterring parties from participating in commercial activity on the Internet. This decrease in commercial activity would occur regardless of whether the parties are commercial entities or consumers. The second danger is over-regulation. The market would become rigid and inflexible, losing the flexibility that can be said to be the Internet's most appealing feature. This in turn would stifle development and perhaps drive commercial entities to set up in jurisdictions with less rigid regulations. Davies states that such 'regulation havens', with reduced or minimal control, are a distinct possibility. Overly strong control could make conducting e-commerce less attractive to parties; the benefits offered by e-commerce would be lost to markets with less rigid regulations, and so economic development would suffer in countries with rigid regulations.

Rowland and Macdonald note a further difficulty in deciding how far the scope of a particular state's regulation should extend[11]. They ask whether it should extend to 'businesses that are based in another state but which conduct business with consumers or businesses in the particular state'. The geographical factors which usually make the scope of a jurisdiction easy to see are blurred in Internet commerce. They use the example of the EC Directive on Certain Aspects of Electronic Commerce in the Internal Market[12] to show an attempt to create a 'balance point' between member states in regulating e-commerce. The Directive recognises the difficulties which commercial entities face when they must take account of different legal regimes. Under the 'country of origin' principle, to which EC member states adhere, activity regulated in one state has the right not to be discriminated against under the regulations of another; in other words, once a service can be marketed in the home state, it can be marketed in all member states. However, these regulations, which provide a balance for e-commerce, offer little help when dealing with commercial entities that are not based in the EU. Using the Directive as an example, incompatibility with its clauses on e-commerce could result in an action being taken, a case being brought in the European Court of Justice (ECJ), and a judgment against a party. If the commercial entities are not based in EC member states, then there is no authoritative organ which can force a party to comply with the regulations.

Amit Sachdeva proposes that the rules governing private international law are inadequate to deal with e-commerce[13]. Sachdeva states that there are four solutions to the problem of regulating cyberspace and its jurisdiction. First, existing laws could be expanded to include the Internet; this suggestion is taken up by Davies but, as noted, an over-regulated system would be detrimental to many economies. Secondly, a new international organisation could be established to propose a set of rules appropriate for cyberspace jurisdiction, which would assist governments when attempting to legislate. Thirdly, these decisions need to take into account commercial entities acting as a decentralised body of various actors and stakeholders.
Lastly, he proposes a treaty-based international harmonisation model in which rules are certain and predictable, yet flexible enough to ensure that the potential benefits of this technology are meaningfully enjoyed by individuals[14]. However, Sachdeva warns that a comprehensive treaty-based solution covering all possible issues is an unrealistic target, as the apparent youth of the Internet suggests that a number of complex issues are yet to be seen[15].

Georgios Zekos believes that new terminology, which recognises the complexity of the relationship between the Internet and the state, is necessary[16]. He suggests that a cyberspace jurisdiction should be used for cyberspace actions, as their effects are only felt in cyberspace. Zekos proposes that cyber courts and cyber arbitral tribunals could have jurisdiction over all actions taking place on the net, with their awards and decisions enforced according to international conventions on Internet enforcement and e-awards[17]. On this view, cyberspace does not owe sovereignty to any state but only to cyberspace itself.

Conclusion

Before adopting any model, or any combination of different models, it must be remembered that the Internet is here to stay, and so is the potential to commit and facilitate unlawful acts, along with the resultant litigation by commercial entities or individuals. We have considered Lessig's argument, but have also seen measures taken by the EC to regulate Internet use. Certainly, with growing numbers of Internet users and the growth of e-commerce, more breaches of law will arise, and it is for states to find an appropriate balance between over-regulating and under-regulating the Internet.

Bibliography

Johnson, D.R. and Post, D. 'Law and Borders: The Rise of Law in Cyberspace' (1996) 48 Stanford Law Review 1367.
Lessig, L. Code and Other Laws of Cyberspace (1999) New York, Basic Books.
Lloyd, I.J. Information Technology Law, 5th ed. (2008) New York, Oxford University Press.
Reed, C. and Angel, J. Computer Law: The Law and Regulation of Information Technology, 6th ed. (2008) New York, Oxford University Press.
Rowland, D. and Macdonald, E. Information Technology Law (2008) London, Cavendish.
Sachdeva, A.M. 'International jurisdiction in cyberspace: a comparative perspective' (2007) 13(8) Computer and Telecommunications Law Review 245-258.
Zekos, G.I. 'State cyberspace jurisdiction and personal cyberspace jurisdiction' (2007) 15(1) International Journal of Law and Information Technology 1-37.

Footnotes

[1] Johnson and Post (1996) p.1367.
[2] Liability for breach of the statutorily implied terms as to the quality of goods in s.14 of the Sale of Goods Act 1979.
[3] Proposed merger of MCI WorldCom and Sprint, Case No. COMP/M.1741.
[4] Lloyd (2008) p.457.
[5] Lessig (1999) p.49.
[6] Lessig p.50.
[7] Lessig p.51. He uses the example of a mandatory 'traceability regulation' under which software could trace a user once he provides a minimal level of identification. The state could then legislate to make it mandatory for banks to do business only with ISPs that have traceability software.
[8] Lessig p.100.
[9] Rowland and Macdonald (2008) p.243.
[10] Lars Davies, www.scl.org/content/ecommerce, s.1.3.2. Report funded by the Society for Computers and Law.
[11] Rowland and Macdonald (2008) p.244.
[12] Directive 2000/31/EC.
[13] Sachdeva (2007) p.245.
[14] Ibid p.255.
[15] Ibid p.256.
[16] Zekos (2007) p.2.
[17] Ibid p.36.

Saturday, January 18, 2020

MBA Database Management Essay

A1. Differentiate between a Traditional File System and a Modern Database System

File-based systems were the traditional systems, and they have now been replaced by modern database systems; today's database applications all run on modern database management systems. The differences between the two approaches are set out below.

File-based System

File-based systems were an early attempt to computerise the manual filing system. A file-based system is a collection of application programs that perform services for the end-users, with each program defining and managing its own data. However, five types of problem occur in using the file-based approach:

Separation and isolation of data: When data is isolated in separate files, it is more difficult to access data that should be readily available. The application programmer is required to synchronise the processing of two or more files to ensure the correct data is extracted.

Duplication of data: The decentralised file-based approach leads to uncontrolled duplication of data. This is undesirable because (i) duplication is wasteful, and (ii) duplication can lead to loss of data integrity.

Data dependence: In a file-based system, the physical structure and storage of the data files and records are defined in the application program code. This characteristic is known as program-data dependence. Making changes to an existing structure is difficult and forces modification of the programs. Such maintenance activities are time-consuming and subject to error.

Incompatible file formats: The structure of a file depends on the application programming language. A file structure provided in one programming language, such as the direct or indexed-sequential files available in COBOL, may differ from the structure generated by another language such as C. This incompatibility makes files difficult to process jointly.

Fixed queries / proliferation of application programs: File-based systems are very dependent upon the application programmer: any required queries or reports have to be written by the programmer. Normally only fixed-format queries or reports can be entertained, and no facility for ad hoc queries is offered. File-based systems also put tremendous pressure on data-processing staff, with users complaining about programs that are inadequate or inefficient in meeting their demands. Documentation may be limited and maintenance of the system difficult, and provision for security, integrity and recovery is very limited.

Database Systems

To overcome the limitations of the file-based approach, the concept of the database and the Database Management System (DBMS) emerged in the 1960s. A database is an application that can store and retrieve data very rapidly. The "relational" part refers to how the data is stored and organised: when we talk about a database we usually mean a relational database, in fact an RDBMS (Relational Database Management System). In a relational database, all data is stored in tables. These have the same structure repeated in each row (like a spreadsheet), and it is the relations between the tables that make it a "relational" database.
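To make the idea of tables and relations concrete, here is a minimal sketch using Python's built-in sqlite3 module. The table names, columns, and rows are invented for the illustration.

import sqlite3

# Two tables; each row repeats the same structure, and the foreign key
# customer_id is the relation that links the two tables together.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders   (id INTEGER PRIMARY KEY,
                           customer_id INTEGER REFERENCES customer(id),
                           amount REAL);
""")
db.execute("INSERT INTO customer VALUES (1, 'Asha')")
db.execute("INSERT INTO orders VALUES (10, 1, 99.50)")

# The DBMS, not the application program, decides how the data is
# physically stored and retrieved.
query = ("SELECT c.name, o.amount FROM customer c "
         "JOIN orders o ON o.customer_id = c.id")
for name, amount in db.execute(query):
    print(name, amount)    # Asha 99.5

Note that the application code never defines the physical structure of the data, which is exactly the program-data independence that the file-based approach lacks.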
Advantages

A number of advantages flow from applying the database approach in an application system, including:

Control of data redundancy: The database approach attempts to eliminate redundancy by integrating the files. Although it does not eliminate redundancy entirely, it controls the amount of redundancy inherent in the database.

Data consistency: By eliminating or controlling redundancy, the database approach reduces the risk of inconsistencies occurring and ensures that all copies of the data are kept consistent.

More information from the same amount of data: With the integration of the operational data, it may be possible to derive additional information from the same data.

Sharing of data: The database belongs to the entire organisation and can be shared by all authorised users.

Improved data integrity: Database integrity provides the validity and consistency of stored data. Integrity is usually expressed in terms of constraints, which are consistency rules that the database is not permitted to violate.

Improved security: The database approach protects the data from unauthorised users. This may take the form of user names and passwords to identify user types and their access rights to operations including retrieval, insertion, updating and deletion.

Enforcement of standards: The integration of the database enforces the necessary standards, including data formats, naming conventions, documentation standards, update procedures and access rules.

Economy of scale: Cost savings can be obtained by combining all of the organisation's operational data into one database, with applications working on one source of data.

Balance of conflicting requirements: With a structural design for the database, conflicts between users or departments can be resolved, and decisions can be based on the best use of resources for the organisation as a whole rather than for an individual entity.

Improved data accessibility and responsiveness: With integration, data access can cross departmental boundaries, providing more functionality and better services to users.

Increased productivity: The database approach provides all the low-level file-handling routines, allowing the programmer to concentrate on the specific functionality required by the users. The fourth-generation environment provided by the database can also simplify application development.

Improved maintenance: The database approach provides data independence. Because a change to the data structure in the database will not affect the application programs, maintenance of database applications is simplified.

Increased concurrency: A database can manage concurrent data access effectively, ensuring that interference between users does not cause loss of information or loss of integrity.

Improved backup and recovery services: Modern database management systems provide facilities, based on the transaction approach, to minimise the amount of processing that is lost following a failure.

Disadvantages

In spite of the large number of advantages of the database approach, it is not without challenges, including the following:

Complexity: A database management system is an extremely complex piece of software, and all parties must be familiar with its functionality to take full advantage of it. Training for administrators, designers and users is therefore required.

Size: A database management system consumes a substantial amount of main memory as well as a large amount of disk space in order to run efficiently.

Cost of DBMS: A multi-user database management system may be very expensive. Even after installation, there is a high recurrent annual maintenance cost for the software.
Cost of conversion: When moving from a file-based system to a database system, the company incurs additional expense on hardware acquisition and training.

Performance: As the database approach caters for many applications rather than exclusively for a particular one, some applications may not run as fast as before.

Higher impact of a failure: The database approach increases the vulnerability of the system due to centralisation. As all users and applications rely on the availability of the database, the failure of any component can bring operations to a halt and seriously affect service to the customer.

Q2. What is the disadvantage of sequential file organization? How do you overcome it? What are the advantages and disadvantages of dynamic hashing?

Disadvantage of sequential file organization: A sequential file contains records or other elements stored in a chronological order based on account number or some other identifying data. To locate the desired data, a sequential file must be read starting at the beginning of the file. A sequential file may be stored on a sequential-access device such as magnetic tape or on a direct-access device such as magnetic disk. Contrast this with a random file: the limitation is overcome by moving to a direct-access organisation, for example a hashed file, as discussed below.

Dynamic hashing

Advantages: The main advantage of hash tables over other table data structures is speed. This advantage is more apparent when the number of entries is large (thousands or more). Hash tables are particularly efficient when the maximum number of entries can be predicted in advance, so that the bucket array can be allocated once with the optimum size and never resized. If the set of key-value pairs is fixed and known ahead of time (so insertions and deletions are not allowed), one may reduce the average lookup cost by a careful choice of the hash function, bucket table size, and internal data structures. In particular, one may be able to devise a hash function that is collision-free, or even perfect; in this case the keys need not be stored in the table.

Disadvantages: Hash tables can be more difficult to implement than self-balancing binary search trees. Choosing an effective hash function for a specific application is more an art than a science, and in open-addressed hash tables it is fairly easy to create a poor hash function. Although operations on a hash table take constant time on average, the cost of a good hash function can be significantly higher than the inner loop of the lookup algorithm for a sequential list or search tree, so hash tables are not effective when the number of entries is very small. (In some cases the high cost of computing the hash function can be mitigated by saving the hash value together with the key.) For certain string-processing applications, such as spell-checking, hash tables may be less efficient than tries, finite automata, or Judy arrays. Also, if each key is represented by a small enough number of bits then, instead of a hash table, one may use the key directly as the index into an array of values; note that there are no collisions in this case.

The entries stored in a hash table can be enumerated efficiently (at constant cost per entry), but only in some pseudo-random order, so there is no efficient way to locate an entry whose key is nearest to a given key. Listing all n entries in some specific order generally requires a separate sorting step, whose cost is proportional to log(n) per entry. In comparison, ordered search trees have lookup and insertion cost proportional to log(n), but allow finding the nearest key at about the same cost, and ordered enumeration of all entries at constant cost per entry. If the keys are not stored (because the hash function is collision-free), there may be no easy way to enumerate the keys that are present in the table at any given moment.

Although the average cost per operation is constant and fairly small, the cost of a single operation may be quite high. In particular, if the hash table uses dynamic resizing, an insertion or deletion operation may occasionally take time proportional to the number of entries. This may be a serious drawback in real-time or interactive applications.

Hash tables in general exhibit poor locality of reference: the data to be accessed is distributed seemingly at random in memory. Because hash tables cause access patterns that jump around, this can trigger microprocessor cache misses that cause long delays. Compact data structures such as arrays, searched with linear search, may be faster if the table is relatively small and keys are integers or other short strings. According to Moore's law, cache sizes are growing exponentially, so what counts as "small" may be increasing; the optimal performance point varies from system to system.

Hash tables become quite inefficient when there are many collisions. While extremely uneven hash distributions are extremely unlikely to arise by chance, a malicious adversary with knowledge of the hash function may be able to supply information which creates worst-case behaviour by causing excessive collisions, resulting in very poor performance (i.e. a denial-of-service attack). In critical applications, either universal hashing can be used or a data structure with better worst-case guarantees may be preferable.
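As a small illustration of the contrast drawn in A2, the Python sketch below (with an invented record layout) compares a sequential scan, which must read from the beginning of the file, with a hashed lookup that goes straight to the right bucket.

# Sequential organization: to find one record we may have to read
# every record that precedes it, so a lookup costs O(n) on average.
records = [("A1001", "Mary"), ("A1002", "John"), ("A1003", "Lena")]

def sequential_lookup(key):
    for account, name in records:    # always starts at the beginning
        if account == key:
            return name
    return None

# Hashed organization: the key is hashed straight to a bucket, so a
# lookup costs O(1) on average, the speed advantage noted above.
index = {account: name for account, name in records}

print(sequential_lookup("A1003"))    # 'Lena', after scanning 3 records
print(index["A1003"])                # 'Lena', in a single probe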
Q3. What is a relationship type? Explain the difference among a relationship instance, a relationship type and a relationship set.

A3. A relationship type R among n entity types E1, E2, ..., En is a set of associations among entities from these types. Formally, R is a set of relationship instances ri, where each ri is an n-tuple of entities (e1, e2, ..., en) and each entity ej in ri is a member of entity type Ej, for 1 ≤ j ≤ n. Hence a relationship type is a mathematical relation on E1, E2, ..., En; alternatively, it can be defined as a subset of the Cartesian product E1 × E2 × ... × En. The entity types E1, E2, ..., En define the set of relationship instances, called the relationship set. (A short sketch illustrating these definitions follows the question lists below.)

Q4. What is SQL? Discuss.

Q5. What is Normalization? Discuss various types of Normal Forms.

Q6. What do you mean by Shared Lock and Exclusive Lock? Describe briefly the two-phase locking protocol.

MI0034 - Database Management System - 4 Credits
Assignment Set 2 (60 Marks)
Answer all the Questions

Q1. Define Data Model and discuss the categories of Data Models. What is the difference between logical data independence and physical data independence?

Q2. What is a B+ Tree? Describe the structure of both internal and leaf nodes of a B+ Tree.

Q3. Describe the projection operation, the set-theoretic operations and the join operation.

Q4. Discuss multi-table queries.

Q5. Discuss the transaction processing concept. Describe the properties of transactions.

Q6. Describe the advantages of a distributed database. What is the client/server model? Discuss briefly security and Internet violations.
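To close, here is the sketch promised in A3, expressed in Python. The entity types EMPLOYEE and DEPARTMENT and the relationship type WORKS_FOR are the standard textbook example, chosen purely for illustration.

from itertools import product

# Entity types E1 = EMPLOYEE and E2 = DEPARTMENT, each a set of entities.
EMPLOYEE = {"ravi", "mira"}
DEPARTMENT = {"sales", "hr"}

# A relationship *instance* is a single tuple (e1, e2) with
# e1 in EMPLOYEE and e2 in DEPARTMENT.
instance = ("ravi", "sales")

# The relationship *set* of relationship type WORKS_FOR is the set of
# all current instances, a subset of the Cartesian product E1 x E2.
WORKS_FOR = {("ravi", "sales"), ("mira", "hr")}

assert instance in WORKS_FOR
assert WORKS_FOR <= set(product(EMPLOYEE, DEPARTMENT))    # subset check

The second assertion is exactly the "subset of the Cartesian product" clause in the definition above.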

Friday, January 10, 2020

Catherine Malasa Essay

Psychology is the scientific study of the mind and behavior. It is a multifaceted discipline that includes many sub-fields of study, such as human development, sports, health, clinical psychology, social behavior and cognitive processes. Because psychology is a new social science, it attempts to investigate the causes of behavior using systematic and objective procedures for observation, measurement and analysis, backed up by theoretical interpretations, generalizations, explanations and predictions.

Psychology is an academic and applied discipline that involves the scientific study of mental functions and behaviors,[1] with the immediate goal of understanding individuals and groups by both establishing general principles and researching specific cases,[3][4] and by many accounts it ultimately aims to benefit society. In this field, a professional practitioner or researcher is called a psychologist and can be classified as a social, behavioral, or cognitive scientist. Psychologists attempt to understand the role of mental functions in individual and social behavior, while also exploring the physiological and neurobiological processes that underlie certain cognitive functions and behaviors.

Question: What Is Cognitive Psychology?

Answer: Cognitive psychology is the branch of psychology that studies mental processes, including how people think, acquire knowledge, perceive, learn, remember or store information, and then apply it. As part of the larger field of cognitive science, this branch of psychology is related to other disciplines including neuroscience, philosophy and linguistics. Areas of research include perception, attention, reasoning, thinking, problem solving, memory, learning, language, and emotion.

Classical cognitive psychology is associated with a school of thought known as cognitivism, whose adherents argue for an information-processing model of mental function, informed by functionalism and experimental psychology. On a broader level, cognitive science is an interdisciplinary enterprise of cognitive psychologists, cognitive neuroscientists, researchers in artificial intelligence, linguists, human-computer interaction researchers, computational neuroscientists, logicians and social scientists. Computational models are sometimes used to simulate phenomena of interest: computational models provide a tool for studying the functional organization of the mind, whereas neuroscience provides measures of brain activity.

The core focus of cognitive psychology is on how people acquire, process and store information. There are numerous practical applications for cognitive research, such as improving memory, increasing decision-making accuracy and structuring educational curricula to enhance learning. Until the 1950s, behaviorism was the dominant school of thought in psychology. Between 1950 and 1970, the tide began to shift against behavioral psychology to focus on topics such as attention, memory and problem-solving. Often referred to as the cognitive revolution, this period generated considerable research on topics including processing models, cognitive research methods and the first use of the term "cognitive psychology". The term "cognitive psychology" was first used in 1967 by American psychologist Ulric Neisser in his book Cognitive Psychology. According to Neisser, cognition involves "all processes by which the sensory input is transformed, reduced, elaborated, stored, recovered, and used.
It is concerned with these processes even when they operate in the absence of relevant stimulation, as in images and hallucinations... Given such a sweeping definition, it is apparent that cognition is involved in everything a human being might possibly do; that every psychological phenomenon is a cognitive phenomenon."

Noam Chomsky helped to launch a "cognitive revolution" in psychology when he criticized the behaviorists' notions of "stimulus", "response", and "reinforcement". Chomsky argued that such ideas, which Skinner had borrowed from animal experiments in the laboratory, could be applied to complex human behavior, most notably language acquisition, in only a superficial and vague manner. The postulation that humans are born with the instinct or "innate facility" for acquiring language helped to renew interest and belief in the mental states and representations, i.e. the cognition, that had fallen out of favor with behaviorists.

[Figure: the Muller-Lyer illusion. Psychologists make inferences about mental processes from shared phenomena such as optical illusions.]

English neuroscientist Charles Sherrington and Canadian psychologist Donald O. Hebb used experimental methods to link psychological phenomena with the structure and function of the brain. With the rise of computer science and artificial intelligence, analogies were drawn between the processing of information by humans and by machines. Research in cognition had proven practical since World War II, when it aided in the understanding of weapons operation.[47] By the late 20th century, cognitivism had become the dominant paradigm of psychology, and cognitive psychology emerged as a popular branch.

Assuming both that the covert mind should be studied and that the scientific method should be used to study it, cognitive psychologists set such concepts as subliminal processing and implicit memory in place of the psychoanalytic unconscious mind or the behavioristic contingency-shaped behaviors. Elements of behaviorism and cognitive psychology were synthesized to form the basis of cognitive behavioral therapy, a form of psychotherapy modified from techniques developed by American psychologist Albert Ellis and American psychiatrist Aaron T. Beck. Cognitive psychology was subsumed, along with other disciplines such as philosophy of mind, computer science, and neuroscience, under the cover discipline of cognitive science.

How is Cognitive Psychology Different?

Unlike behaviorism, which focuses only on observable behaviors, cognitive psychology is concerned with internal mental states. Unlike psychoanalysis, which relies heavily on subjective perceptions, cognitive psychology uses scientific research methods to study mental processes.

Who Should Study Cognitive Psychology?

Because cognitive psychology touches on many other disciplines, this branch of psychology is frequently studied by people in a number of different fields.

Cognitive psychology studies our mental processes or cognitions. The mental processes that cognitive psychologists focus on include memory, perception, thinking and language. The main assumption of the cognitive approach is that information received from our senses is processed by the brain, and that this processing directs how we behave, or at least justifies how we behave the way that we do. Cognitive processes are examples of hypothetical constructs: we cannot directly see processes such as thinking, but we can infer what a person is thinking based on how they act.

Cognitive psychology has been influenced by developments in computer science, and analogies are often made between how a computer works and how we process information. Based on this computer analogy, cognitive psychology is interested in how the brain inputs, stores and outputs information. However, we are much more sophisticated than computer systems, and an important criticism directed at the cognitive approach is that it often ignores the way in which other factors, such as past experiences and culture, influence how we process information.

Loftus and Palmer's (1974) study of eyewitness testimony demonstrates how the cognitive process of memory can be distorted by other information supplied after an event. This highlights that memory is not merely a tape recording but a dynamic process which can be influenced by events such as leading questions, and which changes to make sense of experiences. When we behave in a particular way towards another person, it is likely that we attempt to understand how the other person is thinking and feeling.
Baron-Cohen's (1997) study shows that our behaviour can be influenced by a cognitive process called a theory of mind. Having a theory of mind enables a person to appreciate that other people have thoughts and beliefs that are different from their own. Baron-Cohen's study attempts to demonstrate that the central deficit of autism is a failure to fully develop this cognitive process of a theory of mind. It has been argued that humans are unique in possessing the ability to communicate with language, which involves very sophisticated cognitive skills; however, this argument is challenged by the study from Savage-Rumbaugh et al. (1986), who studied the language capabilities of pygmy chimpanzees.

A main strength of cognitive psychology is that this approach has tended to use a scientific approach through the use of laboratory experiments. A strength of laboratory experiments is that they are high in control, so researchers are able to establish cause and effect. For example, Loftus and Palmer were able to control the age of the participants, the use of video and the location of the experiment. All participants were asked the same questions (apart from changes in the critical words), and the position of the key question was randomised. Furthermore, such standardised experiments are easy to test for reliability. However, as many cognitive studies are carried out in laboratory settings, they can lack ecological validity: when cognitive processes such as memory and theory of mind are studied in artificial situations, it may be difficult to generalise the findings to everyday life.

A further strength of the cognitive approach is the useful contributions that have arisen from it. For example, many modern types of therapy are based on the cognitive approach, since understanding cognitive processes allows us to help people improve processes such as memory and language. The Baron-Cohen et al. study enables us to better understand the behaviour of people with autism, Loftus and Palmer's study highlights the limitations of eyewitness testimony, and the ape research may offer strategies to help children with language difficulties to develop language, or to use strategies such as the lexigram system. Furthermore, the cognitive approach has become the dominant approach in psychology, particularly since it has become allied with neurology. The cognitive approach nowadays is often called cognitive science and is able to provide a very sophisticated understanding of how the brain processes information.

A weakness of the cognitive approach relates to the validity of measuring cognitive processes. We can only infer what a person is thinking, and therefore the cognitive approach relies heavily on self-report measures and observation, whose validity can be questioned for a number of reasons. For example, we can only infer that adults with autism have theory-of-mind difficulties from the results of the Eyes Task, or that pygmy chimps are really using language when they communicate through a lexigram. However, because of the development of brain-scanning techniques, we can now record the active parts of the brain more accurately, and cognitive science is providing an ever more detailed description of how cognitive processes work; brain-scanning techniques, for example, are giving great insights into how memory works.
It has been argued that a weakness of the cognitive approach is that its reliance on the computer analogy leads to a reductionist and mechanistic description of experiences and behaviour. Reductionism is the idea that complex phenomena can be explained by simpler things. The cognitive approach often takes this narrow focus and ignores social and emotional factors which may impact on cognition. For example, the autism study investigated just one central cognitive deficit as an explanation for autism. However, the reductionist approach does have strengths. An advantage of the reductionist view is that by breaking a phenomenon down into its constituent parts it may be possible to understand the whole. This type of single-mindedness has led to some great discoveries in psychology, as it has in the 'natural' sciences.

Thursday, January 2, 2020

IT 205 Assignment Week 8

Hardware Replacement Project

Matthew Sager
IT_205
Feb. 20, 2011
Lori Atkins Mikalonis

Hardware Replacement Project

There are five major variables to consider when starting a major IT project: scope, time, cost, quality, and risk. Most major IT projects will require a project manager to oversee the project. Project management refers to the application of knowledge, skills, tools, and techniques to achieve specific targets within specified budget and time constraints. A project manager's activities include planning the work, assessing the risk, estimating the costs required to complete the project, and several other important duties.

This budget should also include a little extra funding to cover any unexpected expense that may arise.

Quality is an indicator of how well the end result of the project satisfies the project management: all of the objectives specified by the project management must be met and their expectations exceeded. The quality of our CRM system project will usually boil down to improved national marketing campaigns, instant access to customer data, simplified account management, enhanced lead and sales tracking, and opportunities for additional sales to current customers. Quality can also be judged by the accuracy of the scheduling and the timeliness with which the project was completed. The accuracy of the information produced by the new CRM system, and its ease of use, are further measures of its quality. The quality of the hardware and software used in the project also shapes the overall quality of the new CRM system: if we used low-end equipment and cheap software produced by an unknown company, the quality of the CRM system would decrease dramatically, and with it the quality of the information the system provides.

Risk refers to any potential problems that would threaten the likelihood of success for any project. These potential problems might prevent a project from achieving some or all of its objectives by increasing time and cost.