Search results

1 – 10 of 180
Article
Publication date: 17 October 2022

Santosh Kumar B. and Krishna Kumar E.

Deep learning techniques are unavoidable in a variety of domains such as health care, computer vision, cyber-security and so on. These algorithms demand high data transfers but…

Abstract

Purpose

Deep learning techniques are unavoidable in a variety of domains such as health care, computer vision, cyber-security and so on. These algorithms demand high data transfers but encounter bottlenecks in achieving high-speed, low-latency synchronization when implemented on real hardware architectures. Though the direct memory access controller (DMAC) has attracted considerable research attention for achieving bulk data transfers, existing direct memory access (DMA) systems continue to face challenges in achieving high-speed communication. The purpose of this study is to develop an adaptively configured DMA architecture for bulk data transfer with high throughput and less time-delayed computation.

Design/methodology/approach

The proposed methodology consists of a heterogeneous computing system integrated with specialized hardware and software. For the hardware, the authors propose a field programmable gate array (FPGA)-based DMAC, which transfers the data to the graphics processing unit (GPU) using PCI-Express. The workload characterization technique is designed in Python and is implementable on the advanced RISC machine (ARM) Cortex architecture with a suitable communication interface. This module offloads the input data streams to the FPGA and initiates the FPGA's control flow of data to the GPU to achieve efficient processing.

Findings

This paper presents an evaluation of a configurable workload-based DMA controller for collecting the data from the input devices and concurrently applying it to the GPU architecture, bypassing extraneous hardware and software copies and bottlenecks via PCI-Express. It also investigates the use of adaptive DMA memory buffer allocation and workload characterization techniques. The proposed DMA architecture is compared with other existing DMA architectures; the proposed DMAC outperforms traditional DMA, achieving 96% throughput and 50% lower synchronization latency.

Originality/value

The proposed gated recurrent unit has produced 95.6% accuracy in characterizing the workloads as heavy, medium or normal. The proposed model has outperformed the other algorithms, proving its strength for workload characterization.
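
The paper does not reproduce its network here; purely as an illustration of the gated recurrent unit mechanism it relies on, a single GRU cell step can be sketched in NumPy (all weights, dimensions and the random input are hypothetical, not the authors' model):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU cell update: x is the current input, h the previous hidden state."""
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde           # blended new state

rng = np.random.default_rng(0)
d_in, d_h = 4, 8  # hypothetical feature and hidden sizes
x = rng.normal(size=d_in)
h = np.zeros(d_h)
W = [rng.normal(scale=0.1, size=(d_h, d_in)) for _ in range(3)]
U = [rng.normal(scale=0.1, size=(d_h, d_h)) for _ in range(3)]
h_new = gru_step(x, h, W[0], U[0], W[1], U[1], W[2], U[2])
print(h_new.shape)  # (8,)
```

In a workload characterizer of this kind, such a cell would be unrolled over a sequence of workload features and the final hidden state fed to a three-way (heavy/medium/normal) classifier head.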

Details

International Journal of Pervasive Computing and Communications, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 19 April 2022

D. Divya, Bhasi Marath and M.B. Santosh Kumar

This study aims to bring awareness to the development of fault detection systems using the data collected from sensor devices/physical devices of various systems for predictive…

Abstract

Purpose

This study aims to bring awareness to the development of fault detection systems using data collected from sensor devices/physical devices of various systems for predictive maintenance. Opportunities and challenges in developing anomaly detection algorithms for predictive maintenance, as well as unexplored areas in this context, are also discussed.

Design/methodology/approach

For conducting a systematic review on the state-of-the-art algorithms in fault detection for predictive maintenance, review papers from the years 2017-2021 available in the Scopus database were selected. A total of 93 papers were chosen. They are classified under electrical and electronics, civil and construction, automobile, production and mechanical. In addition, the paper provides a detailed discussion of various fault-detection algorithms that can be categorised under supervised, semi-supervised and unsupervised learning and traditional statistical methods, along with an analysis of various forms of anomalies prevalent across different sectors of industry.
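
As a minimal illustration of the unsupervised category mentioned above — not any specific algorithm from the reviewed papers — a z-score detector flags sensor readings that deviate strongly from the historical mean (the sensor stream and threshold below are invented for the example):

```python
import statistics

def zscore_anomalies(readings, threshold=2.5):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(readings)
    std = statistics.pstdev(readings)
    if std == 0:
        return []  # a constant signal has no outliers
    return [x for x in readings if abs(x - mean) / std > threshold]

# Hypothetical vibration-sensor stream with one injected fault spike.
stream = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05, 9.5, 1.0, 1.02, 0.98]
print(zscore_anomalies(stream))  # → [9.5]
```

Real predictive-maintenance pipelines replace this with models such as isolation forests or autoencoders, but the underlying idea — score deviation from learned normal behaviour — is the same.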

Findings

Based on the literature reviewed, seven propositions are presented, focusing on the following areas: the need for a uniform framework while scaling the number of sensors; the need to identify erroneous parameters; the need for new algorithms based on unsupervised and semi-supervised learning; the importance of ensemble learning and data fusion algorithms; the necessity of automatic fault diagnostic systems; concerns about multiple fault detection; and cost-effective fault detection. These propositions shed light on the unsolved issues of predictive maintenance using fault detection algorithms. A novel architecture based on the methodologies and propositions gives the reader more clarity to explore this area further.

Originality/value

Papers for this study were selected from the Scopus database for predictive maintenance in the field of fault detection. Review papers published in this area deal only with methods used to detect anomalies, whereas this paper attempts to establish a link between different industrial domains and the methods used in each industry that uses fault detection for predictive maintenance.

Details

Journal of Quality in Maintenance Engineering, vol. 29 no. 2
Type: Research Article
ISSN: 1355-2511

Article
Publication date: 3 March 2022

Santosh Kumar B. and Krishna Kumar E.

In real-time entertainment processing applications, processing multiple data streams demands highly efficient multiple transfers, which leads to computational overhead…

Abstract

Purpose

In real-time entertainment processing applications, processing multiple data streams demands highly efficient multiple data transfers, which leads to computational overhead for the system-on-chip (SoC) that runs the artificial intelligence algorithms. A high-performance direct memory access controller (DMAC) is incorporated in the SoC to perform multiple data transfers without the participation of the main processors. But achieving an area-efficient and power-aware DMAC suitable for streaming multiple data remains a daunting challenge for researchers.

Design/methodology/approach

The purpose of this paper is to provide DMA operations without intervention of the central processing unit (CPU) for bulk video data transmissions.

Findings

The proposed DMAC has been developed based on a hybrid advanced extensible interface (AXI)-PCI bus subsystem to handle multiple data streams from the video sources. The proposed model consists of a bus selector module, user control signals, status registers, DMA-supported addressing and AXI-PCI subsystems to achieve better performance in analysing the video frames.

Originality/value

Extensive experimentation is carried out on the Xilinx Zynq SoC architecture using very high-speed integrated circuit hardware description language (VHDL) programming, and performance metrics such as utilization area and power are calculated and compared with other existing DMA controllers such as Scatter-DMA, Gather-DMA and Enhanced DMA. Simulation results demonstrate that the proposed DMAC outperforms the other existing DMACs in terms of area, delay and power, which makes the proposed model suitable for streaming multiple video streams.

Details

International Journal of Pervasive Computing and Communications, vol. 18 no. 3
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 24 May 2011

G. Anand, Rambabu Kodali and B. Santosh Kumar

Selection of material handling systems (MHS) is an important decision to be taken during the design of flexible manufacturing systems (FMS) as it affects the layout of FMS. Many…

Abstract

Purpose

Selection of material handling systems (MHS) is an important decision to be taken during the design of flexible manufacturing systems (FMS) as it affects the layout of FMS. Many researchers have addressed this issue of MHS selection in the domain of operations management, while a few of them have addressed this issue in the domain of FMS. However, none of them have modelled this problem by incorporating the relationship/dependencies that exist between various factors/attributes/criteria/elements (in short, it will be called “factors” for the sake of simplicity). The purpose of this paper is to demonstrate the development of the analytic network process (ANP) for the selection of MHS in the design of FMS for a hypothetical case organisation.

Design/methodology/approach

As mentioned above, selection of MHS in the design of FMS is a complex decision-making problem, as it depends on many factors. Hence, one of the recently developed multi-attribute decision-making (MADM) models, namely the ANP, is utilised, as it has the capability to incorporate the relationships that exist between and within different factors. To demonstrate the application of ANP, a hypothetical case situation is presented.
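
The core mechanic that ANP shares with other pairwise-comparison MADM models can be sketched in a few lines: derive priority weights from a pairwise comparison matrix via its principal eigenvector. The 3x3 matrix below, for three made-up alternatives, is purely illustrative and is not the paper's supermatrix, which handles inter-factor dependencies:

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparison matrix for three
# material handling alternatives: conveyor, AGV, industrial robot.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector (found here by power iteration)
# gives the normalised priority weights.
w = np.ones(3) / 3
for _ in range(100):
    w = A @ w
    w /= w.sum()

print(np.round(w, 3))  # conveyor gets the largest weight
```

In full ANP these local priority vectors are assembled into a supermatrix so that dependencies between factors feed back into one another — the feature the paper uses to justify ANP over simpler hierarchy-based models.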

Findings

The results obtained from ANP revealed that the conveyor is a better alternative for the FMS under the given case situation. Furthermore, this study also revealed the computational complexity of the ANP, albeit it successfully incorporates the dependencies/relationships between the factors within the decision-making process.

Practical implications

It is believed that this paper will enable practitioners to appreciate the role of ANP in the strategic decision-making process, apart from helping them understand how decisions can be made in a structured manner. However, it should be understood that although ANP can provide adequate support to the decisions being made, it requires the experience and judgement of the decision makers to arrive at a particular decision.

Originality/value

According to the authors' knowledge, no paper exists in the literature that demonstrates the application of ANP specifically for selecting an MHS during the design of FMS by considering 35 or more factors. Furthermore, the paper attempts to model this problem by incorporating the relationships/dependencies that exist between these factors, which is unique when compared to papers that have already dealt with this problem.

Details

Journal of Advances in Management Research, vol. 8 no. 1
Type: Research Article
ISSN: 0972-7981

Article
Publication date: 30 September 2020

Hossein Derakhshanfar, J. Jorge Ochoa, Konstantinos Kirytopoulos, Wolfgang Mayer and Craig Langston

The purpose of this research is to identify the most impactful delay risks in Australian construction projects, including the associations amongst those risks as well as the…

Abstract

Purpose

The purpose of this research is to identify the most impactful delay risks in Australian construction projects, including the associations amongst those risks as well as the project phases in which they are most likely present. The correlation between project and organisational characteristics with the impact of delay risks was also studied.

Design/methodology/approach

A questionnaire survey was used to collect data from 118 delayed construction projects in Australia. Data were analysed to rank the most impactful delay risks, their correlation to project and organisational characteristics and the project phases where those risks are likely to emerge. Association rule learning was used to capture associations between the delay risks.
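
Association rule learning, as used above, boils down to computing support and confidence for co-occurring items. The toy per-project risk observations below are invented for illustration and are not the survey data:

```python
# Hypothetical per-project delay-risk observations (not the survey data).
projects = [
    {"owner_changes", "slow_decisions", "design_approval"},
    {"owner_changes", "slow_decisions"},
    {"design_approval", "complexity_underestimated"},
    {"owner_changes", "slow_decisions", "complexity_underestimated"},
    {"slow_decisions", "design_approval"},
]

def support(itemset):
    """Fraction of projects in which every risk in `itemset` occurred."""
    return sum(itemset <= p for p in projects) / len(projects)

def confidence(antecedent, consequent):
    """How often the consequent holds when the antecedent does."""
    return support(antecedent | consequent) / support(antecedent)

print(support({"owner_changes", "slow_decisions"}))          # 0.6
print(confidence({"owner_changes"}, {"slow_decisions"}))     # 1.0
```

Algorithms such as Apriori simply search the space of itemsets for rules whose support and confidence exceed chosen thresholds; the study mines such rules to link delay risks to one another.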

Findings

The top five most impactful delay risks in Australia were changes by the owner, slow decisions by the owner, preparation and approval of design drawings, underestimation of project complexity and unrealistic duration imposed on the project, respectively. There is a set of delay risks that are mutually associated with project complexity. In addition, while delay risks associated with resources most likely arise in the execution phase, stakeholder- and process-related risks are more evenly distributed across all the project phases.

Originality/value

This research investigated, for the first time, the impact of delay risks, the associations amongst them and the project phases in which they are likely to occur in the Australian context. This research also sheds light, for the first time, on the project phases for individual project delay risks, which aids project managers in understanding where to focus during each phase of the project.

Details

Engineering, Construction and Architectural Management, vol. 28 no. 7
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 25 June 2019

Hossein Derakhshanfar, J. Jorge Ochoa, Konstantinos Kirytopoulos, Wolfgang Mayer and Vivian W.Y. Tam

The purpose of this paper is to systematically develop a delay risk terminology and taxonomy. This research also explores two external and internal dimensions of the taxonomy to…

Abstract

Purpose

The purpose of this paper is to systematically develop a delay risk terminology and taxonomy. This research also explores two external and internal dimensions of the taxonomy to determine how much the taxonomy as a whole or combinations of its elements are generalisable.

Design/methodology/approach

Using mixed methods research, this systematic literature review incorporated data from 46 articles to establish delay risk terminology and taxonomy. Qualitative data of the top 10 delay risks identified in each article were coded based on the grounded theory and constant comparative analysis using a three-stage coding approach. Word frequency analysis and cross-tabulation were used to develop the terminology and taxonomy. Association rules within the taxonomy were also explored to define risk paths and to unmask associations among the risks.

Findings

In total, 26 delay risks were identified and grouped into ten categories to form the risk breakdown structure. The universal delay risks, and other delay risks that depend more or less on the project location, were determined. Also, it was found that delays connected to equipment, sub-contractors and design drawings are highly associated with project planning, finance and slow decision-making by the owner, respectively.

Originality/value

The established terminology and taxonomy may be used in manual or automated risk management systems as a baseline for delay risk identification, management and communication. In addition, the association rules assist the risk management process by enabling mitigation of a combination of risks together.

Details

Engineering, Construction and Architectural Management, vol. 26 no. 10
Type: Research Article
ISSN: 0969-9988

Article
Publication date: 7 August 2017

Adarsh Anand and Gunjan Bansal

The “quality” of any product or service defines the agility of the product and its life cycle in dynamic environment. The demand of high “quality” becomes an imperative concern…

Abstract

Purpose

The “quality” of any product or service defines the agility of the product and its life cycle in a dynamic environment. The demand for high “quality” becomes an imperative concern when “software” is acting as a product or a service. Since the nature of software is intangible and more complex, the assurance of providing accurate results is a source of anxiety for companies. The overall quality of the software is based upon many individual factors (or attributes) that make software a reliable and long-lasting product in the marketplace. But identifying how these factors influence each other is significant. Therefore, the purpose of this paper is to study the quality aspect of the software and analyse the interrelationships of impactful attributes.

Design/methodology/approach

The analysis has been done through responses sought from software development teams/clients in India. A questionnaire related to software quality was administered to the sample population. The interconnections among impactful characteristics have been analysed using a qualitative technique called interpretive structural modelling (ISM). The driving power and dependence of the attributes under consideration have been classified using cross-impact matrix multiplication applied to classification (MICMAC) analysis. The procedure of applying the ISM method has been automated and provided as the package “ISM” in R.
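
The computational heart of ISM/MICMAC is small: build a binary reachability matrix, close it under transitivity, then read off driving power (row sums) and dependence (column sums). The sketch below uses an invented 4x4 matrix, not the study's data, and is written in Python rather than the paper's R package:

```python
import numpy as np

# Hypothetical initial reachability matrix for four quality attributes
# (reliability, usability, performance, maintainability); 1 = "influences".
M = np.array([
    [1, 1, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
], dtype=int)

# Warshall-style transitive closure: apply the transitivity rule
# (i->k and k->j implies i->j) one pivot k at a time.
R = M.copy()
for k in range(len(R)):
    R |= np.outer(R[:, k], R[k, :])

driving = R.sum(axis=1)     # MICMAC driving power (row sums)
dependence = R.sum(axis=0)  # MICMAC dependence (column sums)
print(driving, dependence)
```

In MICMAC, each attribute is then plotted by (dependence, driving power) to classify it as autonomous, dependent, linkage or driving — which is how the study singles out "reliability" as most influential.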

Findings

In general, it is a very complex job to determine the most impactful attribute of software quality. By applying ISM and MICMAC analysis to the set of attributes under consideration, it has been found that “reliability”, along with “usability” and “performance”, is the most influential and most preferred attribute of software quality.

Research limitations/implications

Though ISM provides an organized modelling framework, its results are considered less statistically significant. Therefore, it would be interesting to combine the present findings with those of an analytical methodology that gives statistically significant results.

Practical implications

The present proposal deals with the interpretation of the software quality attributes and their contextual relationships, but in a more effective and efficient manner. It can help management understand the complexity of the relationships amongst attributes (here, quality attributes) more accurately and precisely. Since today is an era of automation, manual work is being substituted so as to reduce labour cost, improve safety, security and product quality, and increase production. This study is, therefore, an effort and a helping hand in making hassle-free calculations for obtaining intermediate matrices and doing the eventual calculations.

Social implications

Any number of parameters can be selected to analyse the interrelationships in any project/study. The package eradicates human errors in applying the transitivity law or any other operation while solving the problem, can save users' precious time and provides well-formatted, readable Excel output files that make interpretation easier.

Originality/value

Software is one such product/service that plays a significant role in this high-technology world, where every firm tries its best to be at the top of consumers' preference lists. For this purpose, companies reduce manual effort by converting it into quality software that provides deliverables in a systematic manner. Therefore, it becomes imperative to study the various interrelated quality attributes of software. On a similar note, ISM is a widely used technique, and to lend a helping hand in the quantification of qualitative attributes, this paper provides readers with an algorithm developed using R software.

Details

Journal of Advances in Management Research, vol. 14 no. 3
Type: Research Article
ISSN: 0972-7981

Article
Publication date: 10 December 2018

Sunil Kumar Tiwari, Sarang Pande, Santosh M. Bobade and Santosh Kumar

The purpose of this paper is to propose and develop PA2200-based composite powder containing 0-15 Wt.% magnesium oxide before directly using it in selective laser sintering (SLS…

Abstract

Purpose

The purpose of this paper is to propose and develop a PA2200-based composite powder containing 0-15 wt.% magnesium oxide before directly using it in a selective laser sintering (SLS) machine to produce end-use products for low-volume production in engineering applications, with a keen focus on meeting the functional requirements that rely on material properties.

Design/methodology/approach

The methodology reported emphasises the development of a PA2200-based composite powder containing 0-15 wt.% magnesium oxide for the SLS process, which starts with preparation and characterisation of the composite material, followed by thermal and rheological studies of the composite material to decide the optimum process parameters for the SLS machine to obtain optimal part properties. Further, to verify the composite material properties, a conventional casting methodology is used. The compositions of composite materials that possess good properties are then selected for processing in the SLS process under optimal processing parameters.

Findings

The process parameters of the SLS machine are material-dependent. The effect of temperature on the X-ray diffraction profile is negligible in the case of the magnesium oxide reinforced PA2200 composite material. The cyclic heating of the material increases its melting point temperature; this necessitates modifying the part bed temperature of the material each time before processing on the SLS machine to uphold the build part properties as well as the material. With the rise in temperature, the melt flow index and rheological properties of the materials change. The magnesium oxide reinforced PA2200 composite material has higher thermal stability than pure PA2200 material. With the addition of a small quantity of magnesium oxide, most of the mechanical and flammability properties improve, while elongation at break (percentage) decreases significantly.

Practical implications

The proposed PA2200-based composite powder (containing 0-15 wt.% magnesium oxide) development system and the casting methodology to verify the developed material properties will be very useful for developing new composite materials for the SLS process with less material use. The developed methodology has proven useful, especially where non-experts or students need to develop a composite material for the SLS process according to the property requirements of the application.

Originality/value

Unlike earlier composite material development methodologies, the projected methodology of polymer-based composite material development, with confirmation of material properties before commencing the SLS process, provides a straightforward means of developing composite materials for the SLS process with less use of material and in a shorter period of time.

Details

Rapid Prototyping Journal, vol. 25 no. 1
Type: Research Article
ISSN: 1355-2546

Article
Publication date: 1 November 2021

Pradeep Kumar Tarei and Santosh Kumar

This paper proposes a decision-making framework for assessing various dimensions and barriers that have affected the admission process in management educational institutions…

Abstract

Purpose

This paper proposes a decision-making framework for assessing various dimensions and barriers that have affected the admission process in management educational institutions during the ongoing pandemic. The framework considers the interrelationships between the barriers and highlights the importance of each barrier.

Design/methodology/approach

An integrated method based on the decision-making trial and evaluation laboratory (DEMATEL) and the analytical network process (ANP) is proposed to structure the barrier assessment framework. Results obtained from the study are validated by comparing them against the conventional analytical hierarchy process (AHP).
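
The decision-making trial and evaluation laboratory step described above normalises a direct-influence matrix and computes the total-relation matrix T = N(I - N)^-1, from which each barrier's prominence and net cause/effect role are read off. The 4x4 influence matrix below is invented for illustration and is not the study's data:

```python
import numpy as np

# Hypothetical direct-influence matrix among four admission barriers
# (0 = no influence ... 4 = very high influence); not the study's data.
D = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# DEMATEL: normalise by the largest row/column sum, then compute the
# total-relation matrix T = N (I - N)^-1.
s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
N = D / s
T = N @ np.linalg.inv(np.eye(4) - N)

prominence = T.sum(axis=1) + T.sum(axis=0)  # (R + C): overall importance
relation = T.sum(axis=1) - T.sum(axis=0)    # (R - C): cause (+) vs effect (-)
print(np.round(prominence, 2))
print(np.round(relation, 2))
```

In the integrated DEMATEL-ANP method, T typically supplies the interdependency structure that the ANP supermatrix then weights, which is how the framework ranks the barriers.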

Findings

The results obtained from this study indicate the significant dimensions that hinder admission in Indian management institutes, namely, governmental, financial, sectoral, institutional and market. The top five barriers are the demand shift towards technical (alternative) skills, acceptance of the graduated students, lack of industry-institute collaboration, lack of long-term vision and the opening of new Indian Institutes of Technology (IITs) and Indian Institutes of Management (IIMs).

Research limitations/implications

During this ongoing pandemic, many educational institutes have been forced to shift from the traditional classroom to a virtual teaching model. In this regard, this study helps identify and assess the barriers to admission in Indian management institutes during this epidemic and thus contributes to the literature. The findings will assist all stakeholders and policymakers of management institutions in designing and developing appropriate managerial strategies. The study is conducted in the context of Indian management educational institutes and can be extended to technical education institutions for deeper insights.

Originality/value

The paper develops an assessment framework for analysing the barriers to admission in Indian management institutes during the ongoing COVID-19 pandemic. Research implications are discussed in the context of a developing country.

Details

Benchmarking: An International Journal, vol. 29 no. 7
Type: Research Article
ISSN: 1463-5771

Article
Publication date: 21 March 2023

Rajat Kumar, Mahesh Kumar Gupta, Santosh Kumar Rai and Vinay Panwar

The changes in tensile behavior of polycrystalline nanocopper lattice with changes in temperature, average grain size (AGS) and strain rate, have been explored. The existence of a…

Abstract

Purpose

The changes in the tensile behavior of a polycrystalline nanocopper lattice with changes in temperature, average grain size (AGS) and strain rate have been explored. The existence of a critical AGS has also been observed, below which the Hall–Petch relationship inverts.

Design/methodology/approach

Nanoscale deformation of polycrystalline nanocopper has been simulated in this study with the help of an embedded atom method (EAM) potential. The Voronoi construction method has been employed to create four polycrystals of nanocopper with different sizes. Statistical analysis has been used to examine the observations, with emphasis on the effect of polycrystal size on melting point temperature.

Findings

The study has found that the key stress values (i.e. elastic modulus, yield stress and ultimate tensile stress) are significantly influenced by the considered parameters. An increase in strain rate is observed to improve the mechanical properties, whereas an increase in temperature degrades them. An in-depth analysis of the deformation mechanism has been conducted to deliver real-time visualization of grain boundary motion.

Originality/value

This study provides the relationship between the grain size variations required for consecutive possible variations in mechanical properties, and may help reduce the trial processes in the synthesis of polycrystalline copper at different temperatures and strain rates.

Details

Multidiscipline Modeling in Materials and Structures, vol. 19 no. 3
Type: Research Article
ISSN: 1573-6105
