IJCRR - 4(23), December, 2012
Pages: 131-141
Date of Publication: 15-Dec-2012
PERFORMANCE MEASUREMENT OF PAN
Author: Samiksha Nikam, B.T. Jadhav
Category: Technology
Abstract: Performance evaluation of computer systems and networks is an essential task. It helps to determine how a system is performing and whether any upgrades are needed to improve its performance. Three techniques are used to measure system performance: empirical measurement, analytical modelling and simulation. Once a system has been built and is running, its performance can be evaluated using empirical measurement techniques; during the design and development phase, analytical or simulation techniques are used. A Result tool was designed and developed as a case study for this work. The developed Result tool is used for result sheet and mark sheet preparation, together with the various reports required at university and college level. As the Result tool was already developed, we used the empirical measurement method, which consists of three steps: selection of performance parameters, choice of measurement tool and design of the experiment. The design and development of the Result tool and the empirical measurement technique are explained in this paper. Appropriate network and Result tool performance parameters were selected and configured in the performance tool, and performance was measured using Windows XP's built-in performance tool. Experimental setups were implemented to execute the Result tool in a PAN. We found the performance of the PAN to be satisfactory. The developed tool is user friendly and does not create any additional load on the system.
Keywords: Network Performance, PAN, Windows Performance Tool, Result tool, Network Bandwidth
Full Text:
INTRODUCTION
Bluetooth is a low-cost, low-power wireless communication technology. It uses fast frequency-hopping spread spectrum (FHSS) and operates in the unlicensed Industrial Scientific Medical (ISM) band at 2.4 GHz [16]. A personal area network is based on a master-slave model, which keeps connection management simple. However, communication always takes place between a slave and the master; direct slave-to-slave communication is not possible. Two slaves can communicate with each other only through the master, which adds extra data bits for the master to carry out the communication. Before data transmission starts, a connection has to be established between the devices. Bluetooth has the advantage of a standardized way of obtaining the MAC address of new nodes in an ad hoc fashion by using the INQUIRY procedure [16]. The basic structure for communication in a Bluetooth network is the piconet. A piconet contains one master node and up to 7 active slave nodes [2][4]. All transmissions among Bluetooth devices in the same piconet are supervised by the master node, operating over a channel-hopping sequence generated from the master's Bluetooth device address at a rate of 1,600 hops per second [16].

A Result tool was developed for college and university examinations. The tool was initially tested on a standalone machine, where we found its scalability to be satisfactory: resources such as processor, memory and hard disk were not used up to their optimum level, although resource usage will approach the optimum as the data size grows [21]. A variety of simulation tools such as NS-2, NetSim and OPNET are available for modelling and simulation, but the choice of simulator depends upon the features available and the requirements of the application [18]. Since the developed Result tool was tested in a PAN environment, we used Windows XP's built-in performance tool to evaluate its performance. This tool can check the various components of a computer system, store its output as text, HTML or Excel data, and display the information in different ways. It can monitor servers, workstations and networks, and is a cost-effective solution for measuring network performance.
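Because all slave-to-slave traffic is relayed by the master, every slave-to-slave exchange costs two hops and extra overhead at the master. The following toy model (a minimal sketch, not a real Bluetooth stack; the class and method names are our own illustration) makes that relay structure and the 7-active-slave limit explicit:

```python
class Piconet:
    """Toy model of a piconet: one master relays all slave-to-slave traffic."""
    MAX_ACTIVE_SLAVES = 7  # a piconet allows at most 7 active slaves

    def __init__(self, master_addr):
        self.master_addr = master_addr
        self.slaves = {}  # slave address -> list of received payloads

    def join(self, slave_addr):
        if len(self.slaves) >= self.MAX_ACTIVE_SLAVES:
            raise RuntimeError("piconet already has 7 active slaves")
        self.slaves[slave_addr] = []

    def send(self, src, dst, payload):
        """Slave-to-slave delivery always goes via the master: two hops."""
        hops = [(src, self.master_addr),   # hop 1: source slave -> master
                (self.master_addr, dst)]   # hop 2: master -> destination slave
        self.slaves[dst].append(payload)
        return hops

net = Piconet("master-0")
net.join("slave-1")
net.join("slave-2")
print(net.send("slave-1", "slave-2", "marks data"))  # two hops via the master
```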
SYSTEM DEVELOPED
System development life cycle (SDLC) is the process used during the development of any system. SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, context diagrams and data flow diagrams are used to produce the process model of a system. In the SDLC, a system model can be developed using a Data Flow Diagram (DFD). A DFD is a graphical notation for specifying, constructing and visualizing a system, and is used to define requirements in a graphical view [3], [20]. DFDs are easily understood by both technical and non-technical users. A DFD shows the flow of data from external entities into the system, how data moves from one process to another, and its logical storage. In a data flow diagram, the highest-level view of the system is known as the context diagram [10], [19]. It is common practice for a designer to draw the context-level DFD first, which shows the interaction between the system and the external agents that act as data sources and data sinks. The components of a system context diagram are shown in Figure 1.1. The context diagram represents entities such as:
- University
- Clerk
- Principal
- Student
- Exam Department
- Teacher
These entities interact with the result system. Entities on the left-hand side supply information to the system, and after that information is processed the result is sent to the specific entities on the right-hand side. The context diagram represents the highest-level view of the system [3]: it shows the entire result system as a single process and gives no clues as to its internal organization. The purpose of the result system context diagram is to focus attention on the external factors and events that should be considered in developing a complete set of system requirements and constraints. The context diagram can be further expanded into the detailed design of the Result tool, as the sketch after this paragraph suggests.
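For illustration, a context diagram can be captured as plain data before being expanded into lower-level DFDs. Only the entity names below come from Figure 1.1; the flow labels are hypothetical examples of our own, since the paper does not enumerate the individual flows:

```python
# A context diagram reduced to data: one central process, external
# entities, and labelled flows into and out of the process.
context_diagram = {
    "process": "Result System",
    "entities": ["University", "Clerk", "Principal",
                 "Student", "Exam Department", "Teacher"],
    # (source, destination, data label) -- labels are illustrative only
    "flows": [
        ("Clerk", "Result System", "student marks entry"),      # hypothetical
        ("Exam Department", "Result System", "exam schedule"),  # hypothetical
        ("Result System", "University", "result sheets"),       # hypothetical
        ("Result System", "Student", "mark sheet"),             # hypothetical
    ],
}

# Every flow must touch the central process; in a context diagram,
# external entities never exchange data with each other directly.
assert all("Result System" in (src, dst)
           for src, dst, _ in context_diagram["flows"])
```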
PERFORMANCE EVALUATION METHODOLOGY
Performance can be evaluated using empirical measurement, analytical modelling or simulation techniques. Once a system has been built and is running, its performance can be evaluated using empirical measurement; during the design and development phase it is necessary to use analytical or simulation techniques [2, 5, 15]. As the Result tool was already developed, we used the empirical measurement method, which consists of the following steps:
1. Selection of the performance parameters to measure.
2. Choice of the measurement tool.
3. Design of the measurement experiment.
Performance measurement metrics classification and selection
Network Performance Parameters
Network performance measurement parameters (NPMs) are the basic metrics for performance. We categorize these parameters into four groups [1, 8]:
- Availability
- Utilization
- Loss
- Delay

Availability refers to the connectivity and functionality of the network. It evaluates the robustness of the network, i.e. the percentage of time the network runs without failure. A specific network element, such as a link or a node, can be observed to determine the amount of time it runs without failure. Loss is the fraction of packets lost in transit from source to destination within a specific time interval, expressed as a percentage; it indicates congestion, transmission errors and device malfunction. Delay metrics indicate the responsiveness of the network and can be measured as one-way delay, round-trip time and jitter. The utilization metric measures the capacity of a communication link, i.e. the amount of data that can pass through the network in unit time [1, 4, 8].
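A minimal sketch of how three of these metrics can be computed from raw measurements (the function names and example figures are our own, not from the paper):

```python
def loss_percent(packets_sent, packets_received):
    """Loss: fraction of packets that never arrived, as a percentage."""
    return 100.0 * (packets_sent - packets_received) / packets_sent

def utilization_percent(bits_transferred, interval_s, link_capacity_bps):
    """Utilization: achieved throughput relative to link capacity."""
    throughput_bps = bits_transferred / interval_s
    return 100.0 * throughput_bps / link_capacity_bps

def jitter_ms(one_way_delays_ms):
    """Jitter: mean absolute variation between consecutive delay samples."""
    diffs = [abs(b - a) for a, b in zip(one_way_delays_ms, one_way_delays_ms[1:])]
    return sum(diffs) / len(diffs)

# Illustrative values only:
print(loss_percent(1000, 990))                       # 1.0 %
print(utilization_percent(3_000_000, 5, 1_000_000))  # 60.0 % of a 1 Mbps link
print(jitter_ms([12.0, 14.5, 13.0, 13.5]))           # 1.5 ms
```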
Result Tool Performance Parameters
There are two important dimensions to software performance: responsiveness and scalability. Responsiveness is the ability of software to meet its objectives for response time or throughput, where response time is the time required to process a request and throughput is the number of events processed in some interval of time. Scalability is the ability of software to continue to meet its response time or throughput objectives as the demand for the software function increases [11, 25, 27]. The scope of performance therefore includes responsiveness and scalability [21]. In software engineering, performance testing is testing performed to determine how fast some aspect of a system performs under a particular workload. It can also serve to validate and verify other quality attributes of the system, such as scalability, reliability and resource usage [14, 15, 21, 23]. Resource sharing is very common in computer systems; resources are any physical or logical entities that software needs for its execution [25]. Processor, memory and hard disk are the three basic components that affect the performance of a computer system [21]. The developed Result tool's response time and throughput are satisfactory, so here we concentrate on the scalability and resource usage parameters to measure the performance of the Result tool. The three major components that affect the scalability and resource usage capacity of a computer system are the speed and power of its processor, the amount of memory it has, and the performance of its disk subsystem [21].
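As a simple illustration of how response time and throughput can be measured empirically (a sketch with a placeholder workload; `run_transaction` stands in for any Result tool operation and is not part of the paper's code):

```python
import time

def run_transaction():
    """Placeholder for one operation, e.g. fetching and storing a record."""
    sum(range(10_000))  # stand-in work

N = 100
latencies = []
start = time.perf_counter()
for _ in range(N):
    t0 = time.perf_counter()
    run_transaction()
    latencies.append(time.perf_counter() - t0)  # response time per request
elapsed = time.perf_counter() - start

print(f"mean response time: {1000 * sum(latencies) / N:.2f} ms")
print(f"throughput: {N / elapsed:.1f} transactions/sec")
```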
Measurement Tool
To evaluate the performance of the PAN, the following counters were selected. The availability parameter was not considered for measuring performance: since the tool had to be tested in a PAN environment, it was assumed that the network was always available to the user. We created a counter log file containing the counters shown in Table 1. During the experiment, the counter log file stored performance data for the defined counters at the specified interval.
Counters used
1. Current Bandwidth: It is an estimate of the current bandwidth of the network interface in bits per second (BPS). For interfaces that do not vary in bandwidth or for those where no accurate estimation can be made, this value is the nominal bandwidth [25].
2. Packets Received/sec: It is the rate at which packets are received on the network interface [25].
3. Packets Sent/sec: It is the rate at which packets are sent on the network interface [25].
4. Output Queue Length: It is the length of the output packet queue (in packets). If this is longer than two, there are delays and the bottleneck should be found and eliminated [25].
5. Available Memory: It is the amount of physical memory available to processes running on the computer, in bytes [25].
6. Pages/sec: It is the rate at which pages are read from or written to disk to resolve hard page faults [25].
7. Avg. Disk Queue Length: It is the average number of both read and write requests that were queued for the selected disk during the sample interval [25].
8. % Processor Time: It is the percentage of elapsed time that the processor spends executing a non-idle thread [25].
Experimental setup
Experiments were carried out in a personal area network; the experimental setup is shown in Fig. 2. Bluetooth USB dongles compliant with V2.0 and V1.2 were used to form the Bluetooth network. In a Bluetooth network any machine can act as master or slave, but communication always takes place between two machines of which one is the master and the other a slave. During the experiment, a computer with an Intel Core™2 CPU 4400 @ 2.00 GHz, 1 GB of RAM and the Windows 2003 operating system was treated as the master. Computers with an Intel Core™2 CPU 4400 @ 2.00 GHz, 500 MB of RAM and the Windows XP operating system were treated as slaves. A client copy of the developed Result tool was installed on each slave machine. Various transactions, such as data fetch and store, were performed by users, and all transactions were handled through the master machine's database. The counters specified in Table 1 were set in the performance tool of both the master and the slave machines. Various operations such as subject registration, exam registration and student registration were performed by users. Throughout the experiment, data was collected simultaneously in counter log files; a log file can be switched on or off manually. Performance data was stored at the location C:\PerfLogs, and the collected performance data was exported to an Excel file using the procedure shown in Figure 3.
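The paper drove collection through the Performance (perfmon) GUI. For comparison, a roughly equivalent scripted collection is sketched below using Windows' built-in typeperf utility invoked from Python; the counter paths are the standard Windows names corresponding to Table 1, while the output file name is our own choice:

```python
import subprocess

# Standard Windows performance counter paths corresponding to Table 1.
counters = [
    r"\Network Interface(*)\Current Bandwidth",
    r"\Network Interface(*)\Packets Received/sec",
    r"\Network Interface(*)\Packets Sent/sec",
    r"\Network Interface(*)\Output Queue Length",
    r"\Memory\Available Bytes",
    r"\Memory\Pages/sec",
    r"\PhysicalDisk(_Total)\Avg. Disk Queue Length",
    r"\Processor(_Total)\% Processor Time",
]

# Sample every 5 seconds, 60 samples = 300 s, as in the PAN experiment.
subprocess.run(
    ["typeperf"] + counters
    + ["-si", "5", "-sc", "60", "-f", "CSV",
       "-o", r"C:\PerfLogs\pan_log.csv", "-y"],
    check=True,
)
```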
Experimental data collection
During the experiment, the counter log file stored network performance values for the defined counters at an interval of five seconds. We recorded sixty observations spanning a 300-second duration; sample readings are shown in Table 2. The experimental performance data for the PAN is shown in Table 2, and the experimental performance data for the Result tool is shown in Table 3. For measuring the performance of the tool, the counter log file stored data for the defined counters at an interval of 15 seconds, again over a 300-second duration.
Performance analysis
Performance analysis of the Result tool is divided into three basic steps:
1) Data Collection
2) Data Transformation
3) Data visualization
Data was collected in a log file during execution of the Result tool for the counters specified in Table 1. The log file can be switched on or off manually. Performance data was stored at the location C:\PerfLogs, and the collected data was exported to an Excel file. Data transformation techniques were used to reduce the size of the experimental data, and Microsoft Excel's built-in charting tool was used for data visualization. The experimental performance data is presented in graphical format against the sampled interval or time. An equivalent scripted pipeline is sketched below.
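For readers who prefer scripting to Excel, the same three steps can be reproduced with standard Python libraries. A sketch, assuming the CSV log produced by the earlier collection sketch; the one-minute averaging window is our own illustrative choice of data transformation:

```python
import csv
from statistics import mean
import matplotlib.pyplot as plt

# Step 1: data collection -- load the counter log exported as CSV
# (the first column is the timestamp).
with open(r"C:\PerfLogs\pan_log.csv", newline="") as f:
    rows = list(csv.reader(f))
header, samples = rows[0], rows[1:]

col = header.index(next(h for h in header if "% Processor Time" in h))
cpu = [float(r[col]) for r in samples if r[col].strip()]

# Step 2: data transformation -- reduce data size by averaging twelve
# five-second samples into one value per minute.
per_minute = [mean(cpu[i:i + 12]) for i in range(0, len(cpu), 12)]

# Step 3: data visualization -- plot against elapsed time.
plt.plot(range(1, len(per_minute) + 1), per_minute, marker="o")
plt.xlabel("Elapsed time (minutes)")
plt.ylabel("% Processor Time")
plt.title("Processor utilization during the experiment")
plt.show()
```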
Performance Analysis of PAN
Bandwidth specifies the data transfer capacity of a network. Throughout the experiments the available bandwidth was 1 Mbps, as shown in Fig. 4. In a Bluetooth network a connection has to be established between master and slave before data is transferred; in other words, data transmission in a Bluetooth network is connection oriented, so there is very little chance of data delay or loss. We noted from the Output Queue Length counter values that for all transactions the packet queue contained no packets waiting for transmission; hence no delays or traffic jams were found while using the tool in the Bluetooth personal area network. Fig. 5 and Fig. 6 depict graphical representations of the packets sent and received in a piconet.
Performance Analysis of Result Tool
We selected the scalability and resource usage parameters to measure the performance of the Result tool. The graphical representation in Fig. 7 shows that the memory available to execute other applications is about 150 MB. Fig. 8 shows the pages transferred through main memory while performing various operations with the Result tool; we noted no heavy page (data) transfer during operation. Fig. 9 shows that the Avg. Disk Queue Length counter value is less than 0.7, so there was no significant traffic through the hard disk. From the graphical representation in Fig. 10 we see that processor consumption is approximately 4-5%.
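The interpretation thresholds used above (an output queue longer than two packets signals a bottleneck; the observed disk queue stayed below 0.7; processor consumption stayed near 4-5%) can be folded into a small per-sample check. A sketch; the function name and the 10% processor ceiling are our own choices:

```python
def resource_usage_ok(output_queue_len, avg_disk_queue, cpu_percent):
    """Apply the paper's interpretation thresholds to one sample."""
    checks = {
        # >2 queued packets indicates a network bottleneck (Table 1, counter 4)
        "network queue": output_queue_len <= 2,
        # observed disk queue stayed below 0.7 (Fig. 9)
        "disk queue": avg_disk_queue < 0.7,
        # observed consumption was ~4-5% (Fig. 10); 10% ceiling is our choice
        "processor": cpu_percent < 10,
    }
    return all(checks.values()), checks

ok, detail = resource_usage_ok(output_queue_len=0,
                               avg_disk_queue=0.4,
                               cpu_percent=5.0)
print(ok, detail)  # True, with every individual check passing
```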
CONCLUSION
The developed Result tool was executed in a personal area network, and the performance of the network is satisfactory. Only two computers can communicate at a time and there are no packets in the waiting queue, so there is very little chance of delay or data loss. Bluetooth networks are known for low power consumption, and processor consumption here is only 4-5%, which is an added advantage compared with other networks. The performance of the developed Result tool in the personal area network is also satisfactory: there is no heavy data transfer through memory or the hard disk drive, and sufficient memory remains available for other applications to execute. The developed Result tool is user friendly.
ACKNOWLEDGEMENT
Authors acknowledge the immense help received from the scholars whose articles are cited and included in the references of this manuscript. The authors are also grateful to the authors/editors/publishers of all those articles, journals and books from which the literature for this article has been reviewed and discussed.
References:
1. Andreas Hanemann, Athanassios Liakopoulos, Maurizio Molina, D. Martin Swany, "A Study on Network Performance Metrics and their Composition". http://marco.uminho.pt/~dias/MIECOM/GR/Projs/P4/TNC_Metric_Comp_FWork-fullv4.pdf
2. Andrew S. Tanenbaum, Computer Networks, Fourth Edition, PHI, 2008.
3. Awad Elias M., Structured System Analysis and Design, Second Edition, Galgotia.
4. Behrouz A. Forouzan, Data Communications and Networking, Fourth Edition, Tata McGraw-Hill, 2006.
5. Claus Pahl, Marko Boskovic, Wilhelm Hasselbring, "Model Driven Performance Evaluation for Service Engineering". ceur-ws.org/Vol-313/paper11.pdf
6. Craig Zacker, Networking: The Complete Reference, Fourteenth Reprint, Tata McGraw-Hill, 2001.
7. David D. Clark, Member IEEE, Proceedings of the IEEE, Vol. 66, No. 11, Nov. 1978.
8. Hyo-Jin Lee, Myung-Sup Kim, James W. Hong, Gil-Haeng Lee, "QoS Parameters and Network Performance Metric Mapping for SLA Monitoring". http://www.knom.or.kr/knomreview/v5n2/4.pdf
9. Ibrahim K. El-Far and James A. Whittaker, "Model-based Software Testing", Encyclopedia of Software Engineering (edited by J.J. Marciniak), Wiley, 2001.
10. Jilani Atif A.A., Usman Mohammad, Nadeem Amer, "Comparative Study on DFD to UML Designs Transformations", World of Computer Science and Information Technology, ISSN 2221-0741, Vol. 1, No. 1, pp. 10-16, 2011.
11. Jiantao Pan, "Software Reliability", Spring 1999.
12. K.S.R. Anjaneyulu, John R. Anderson, "The Advantage of Data Flow Diagrams for Beginning Programming", Paper 19, 2008. http://repository.cmu.edu/psychology/
13. Manish Jain, Constantinos Dovrolis, "End-to-End Available Bandwidth: Measurement Methodology, Dynamics and Relation with TCP Throughput", in Proceedings of ACM SIGCOMM, August 19-23, 2002.
14. Mohd. Ehmer Khan, "Different Forms of Software Testing Techniques for Finding Errors", IJCSI International Journal of Computer Science Issues, ISSN (Online): 1694-0784, ISSN (Print): 1694-0814, Vol. 7, Issue 3, No. 1, May 2010.
15. Omer Nauman Mirza, "Software Performance Evaluation using UML-PSI", Master's Thesis, IT University of Goteborg, 2007.
16. P. Johansson, R. Kapoor, M. Kazantzidis, and M. Gerla, "Personal Area Networks: Bluetooth or IEEE 802.11?", International Journal of Wireless Information Networks, Vol. 9, No. 2, April 2002.
17. Savan K. Patel, Hiral R. Patel, Ravi S. Patel, "Bluetooth Usage with Architecture View and Security Measures", International Journal of P2P Network Trends and Technology, Vol. 1, Issue 3, ISSN 2249-2615, 2011.
18. Rahul Malhotra, Vikas Gupta, R.K. Bansal, "Simulation and Performance Analysis of Wired and Wireless Computer Networks", International Journal of Computer Applications (0975-8887), Vol. 14, No. 7, Feb. 2011.
19. Rosziati Ibrahim, Siow Yen Yen, "An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)", World Academy of Science, Engineering and Technology 69, 2010.
20. Rosziati Ibrahim and Siow Yen Yen, "Formalization of the Data Flow Diagram Rules for Consistency Check", International Journal of Software Engineering & Applications (IJSEA), Vol. 1, No. 4, October 2010.
21. Samiksha Nikam, B.T. Jadhav, "Design and Development of Result Tool for College & University Exam and its Performance Study", International Journal on Computer Science and Engineering, Vol. 3, Issue 11, 2011, pp. 2269-2276.
22. Ningning Hu, "Network Monitoring and Diagnosis Based on Available Bandwidth Measurement", PhD Thesis, CMU-CS-06-122, School of Computer Science, Carnegie Mellon University, Pittsburgh, PA 15213, May 2006.
23. Thomas Thelin, "Automated Statistical Testing Suite for Software Validation".
24. V. Babka, P. Tuma, "Effect of Memory Sharing on Contemporary Processor Architecture", MEMICS'07, Znojmo, Czech Republic, 2007.
25. V. Babka, "Resource Sharing in QPN-based Performance Models", WDS'08 Proceedings of Contributed Papers, Part I, pp. 202-207, 2008.
26. Walter Glenn, Tony Northrup, Installing, Configuring and Administering Microsoft Windows XP Professional, Self-Paced Training Kit, 2005 Edition.
27. Xiang Gan, "Software Performance Testing", Seminar Paper, University of Helsinki, 26-9-2006.