by EDMS Consultants Sdn Bhd | Jun 30, 2021 | M-Files, Uncategorized
M-Files is launching a new online user community on March 15. In the weeks prior to launch, it’s a good time to take a step back and consider online user communities in general. What do they do? What needs do they serve? And why are they so important for SaaS success?
It used to be that technology adoption was viewed as a relatively simple transaction — with a clear start and end.
Decide what software you need.
Buy it.
Learn how to use it.
Done.
Today’s SaaS providers understand what their predecessors did not — that adopting new technology is a process, not a transaction, and there are many varying stages along the customer journey. At every stage, a customer has distinct needs and questions.
Enter communities. Online user communities serve customers by providing one clear place to ask questions, get resources, and provide input, regardless of where they are in their journey. How do communities do that?
Communities Connect People.
Communities bring together people with shared interests, values, and needs (and they have done that since well before the “online” part was even invented). In an online community, a SaaS customer can share ideas and best practices — not just with SaaS employees and experts, but with other customers, customers who face the exact same challenges that they do.
“You encountered that? So did WE, and here’s how we handled it.”
Communities Consolidate Resources.
Most SaaS companies offer extensive resources for customer and administrator training and enablement — user training, administrator training, quick tips. But from a customer perspective, it can be hard to keep track of what’s available, and even if you remember which resource you need, it can be even harder to remember where that resource lives. Communities solve this problem by providing one place from which all relevant resources can be linked and accessed.
Communities Fill the Knowledge Gaps.
Even with plentiful training resources, it’s easy for a customer to have a question that doesn’t quite fit under a nice, tidy heading — leaving the customer with not only the question itself, but the burden of having to figure out where to ask it. Is this a support question or a settings question? Or is it a procedure question that doesn’t actually touch on the technology at all? Communities provide a place for SaaS customers to ask any question, even if they’re not sure whom to ask or what the underlying factors are.
Communities Give Customers a Voice.
SaaS providers know their own software inside and out, but it’s the customers who understand what it’s like to use that software in real time. Customers have extraordinary insight into new use cases and problems. An online community gives customers a chance to share that insight and offer ideas for how the software might develop.
Our entire team is excited for the launch of the M-Files Community on March 15. The purpose is to bring together customers, admins, partners and M-Files employees to discuss all things M-Files. Whether it’s a technical question, best practices or just a nickel’s worth of free advice, the M-Files Community will:
- Establish a sense of connection within the global M-Files community. With thousands of customers and many more individual users, the community is the one-stop destination connecting everyone who shares a passion for M-Files intelligent information management.
- Facilitate self-service. The community is a place where users can get answers quickly from their peers without needing to log a formal support ticket (although our Customer Success advocates will still be available for your support needs).
- Help us gather feedback from M-Files users about the product. In our customer-centric business, feedback is a gift, and we want to hear from you. The community is an avenue for users and admins to provide feedback, which will inform future product roadmaps.
Source: https://resources.m-files.com/blog/4-reasons-saas-users-love-online-user-communities-and-why-youll-love-the-m-files-community
by EDMS Consultants Sdn Bhd | Jun 28, 2021 | business process management, Newgen, Robotic Process Automation Solution, RPA, Uncategorized
Hyperautomate with Process Insights and Artificial Intelligence for Efficient Processes
In the first blog of this series, I shared one half of the hyperautomation journey—how RPA and BPM can work in harmony to automate incredibly complex business tasks.
In this one, let’s delve into the second half—how complementary technologies such as process insights and artificial intelligence (AI) are crucial parts of hyperautomation, enabling rapid, end-to-end business process automation and accelerating digital transformation.
Process Insights: Discover, Monitor, and Improve Workflows
Process insights are created by leveraging event logs, generated by enterprise systems including ERP, BPM, CRM, human capital management, and supply chain management, to rebuild a virtual view of your business processes. These insights are designed for you to discover, monitor, and improve real processes by extracting knowledge available within application systems.
Process mining, one of several stages in the process automation lifecycle, analyzes the extent to which RPA can be implemented across legacy systems. It also enables monitoring and analysis of process performance for continuous improvement. Robust process mining tools can blend data mining with AI and machine learning (ML) to generate data-driven analytics, helping you assess the state of your business processes and identify bottlenecks and new opportunities for optimization and automation.
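As a concrete illustration of the core idea, here is a minimal Python sketch of directly-follows process discovery: replaying an event log to see which activity follows which, and where waiting time accumulates. The log schema, case IDs, and activity names are illustrative assumptions, not any particular vendor's format.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative event log: (case_id, activity, timestamp) triples of the
# kind exported by an ERP/BPM/CRM system. The schema is an assumption.
event_log = [
    ("case-1", "Receive Invoice", "2021-06-01T09:00"),
    ("case-1", "Approve Invoice", "2021-06-01T15:00"),
    ("case-1", "Pay Invoice",     "2021-06-03T10:00"),
    ("case-2", "Receive Invoice", "2021-06-02T11:00"),
    ("case-2", "Approve Invoice", "2021-06-04T16:00"),
    ("case-2", "Pay Invoice",     "2021-06-05T09:00"),
]

# Group events by case and sort each trace chronologically.
traces = defaultdict(list)
for case_id, activity, ts in event_log:
    traces[case_id].append((datetime.fromisoformat(ts), activity))

edge_counts = defaultdict(int)    # how often activity B directly follows A
edge_hours = defaultdict(float)   # total waiting time accumulated on that edge

for trace in traces.values():
    trace.sort()
    for (t1, a1), (t2, a2) in zip(trace, trace[1:]):
        edge_counts[(a1, a2)] += 1
        edge_hours[(a1, a2)] += (t2 - t1).total_seconds() / 3600

# The directly-follows graph is a basic process map; the slowest edges
# are candidate bottlenecks and therefore automation targets.
for edge, count in sorted(edge_counts.items()):
    avg = edge_hours[edge] / count
    print(f"{edge[0]} -> {edge[1]}: {count} cases, avg {avg:.1f} h")
```

Real process mining tools add far more (conformance checking, variant analysis, ML-driven prediction), but they are built on this same replay of event logs.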
Artificial Intelligence: Automate Repetitive-to-Cognitive Processes the Smart Way
AI enables bots to intelligently perform tasks, such as reading, understanding, and processing data, thus making it an essential ingredient of hyperautomation. Cognitive technologies, such as ML, natural language processing (NLP), optical character recognition (OCR), and AI, integrate with RPA to increase process efficiency and accuracy. You should deploy these technologies in tandem to realize business value and deliver specific, measurable outcomes for targeted use cases.
Understanding Hyperautomation with a Use Case
Let’s suppose you are automating an anti-money laundering process and implementing a fraud detection algorithm. You may need to understand the interfaces between your AI components and other automation tools. Many of these processes involve non-routine tasks, intelligent decision making, and human judgment.
In this case, your system would execute the following steps, sketched in code below:
- An intelligent business process management suite manages the decision-driven workflow/orchestration of your process
- It triggers an RPA bot to perform data collection, and other routine work, to validate your customer records
- The consolidated data is fed into the fraud detection algorithm, built on an ML model, to identify patterns. This process can sometimes involve human intervention, in case a formal approval or e-signature is required
- Subsequently, another RPA bot is triggered to perform follow-up actions and update transactional systems, such as ERP, CRM, and other applications
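To make those hand-offs concrete, here is a minimal Python sketch of the orchestration. Every function name and the review threshold are hypothetical stand-ins for the iBPMS, RPA bots, and trained ML model described above; this illustrates the flow of the steps, not a production design.

```python
# Illustrative stand-ins only; a real deployment would call an iBPMS,
# RPA bots, and a trained fraud-detection model in place of these.

def rpa_collect_customer_data(customer_id: str) -> dict:
    """Stand-in for an RPA bot gathering records from core systems."""
    return {"customer_id": customer_id, "transactions": [120.0, 9800.0, 45.0]}

def ml_fraud_score(record: dict) -> float:
    """Stand-in for an ML model; returns a risk score in [0, 1]."""
    # Toy heuristic: flag unusually large transactions.
    return min(1.0, max(record["transactions"]) / 10_000.0)

def human_review(record: dict, score: float) -> bool:
    """Stand-in for the manual approval/e-signature step."""
    print(f"Review needed for {record['customer_id']} (score={score:.2f})")
    return True  # pretend the analyst approves the escalation

def rpa_update_systems(record: dict, flagged: bool) -> None:
    """Stand-in for a follow-up bot updating ERP/CRM and case systems."""
    status = "FLAGGED" if flagged else "CLEARED"
    print(f"{record['customer_id']}: {status} written back to ERP/CRM")

def aml_workflow(customer_id: str, review_threshold: float = 0.7) -> None:
    """The decision-driven flow an iBPMS would orchestrate (step 1)."""
    record = rpa_collect_customer_data(customer_id)   # step 2: RPA collection
    score = ml_fraud_score(record)                    # step 3: ML scoring
    flagged = score >= review_threshold and human_review(record, score)
    rpa_update_systems(record, flagged)               # step 4: follow-up bot

aml_workflow("cust-42")
```

The point of the sketch is the division of labor: the orchestration layer makes decisions, bots do the routine work, the model supplies judgment at scale, and a human stays in the loop where formal approval is required.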
The Time for Action is Now!
Hyperautomation is the key to adapting to the ever-changing business environment and achieving unprecedented levels of quality and efficiency. RPA, BPM, process insights, and AI will enable your organization to achieve scale and flexibility in operations and allow your employees to focus on more value-added tasks.
Source: https://newgensoft.com/blog/hyperautomation-the-answer-to-your-automation-needs-part-2/
by EDMS Consultants Sdn Bhd | Jun 25, 2021 | Digital transformation, Document Management, M-Files, Uncategorized
Over the years, document management systems have evolved to the point where they’ve become a major contributing factor to the productivity of your organization… or they can be, under the best possible circumstances.
So how do you lean into the strengths that make these solutions so powerful, taking advantage of those “best circumstances” at every opportunity? By making seven little changes to your approach that add up to a big, big difference before you know it.
Create a Culture that Embraces New Software
By far, the most important thing you can do to make a big difference with your document management system is to embrace new software when the situation calls for it. But don’t just drop a new solution into your employees’ laps. Start slow: schedule training sessions to make sure people actually know how to use it (something that will certainly increase adoption), and make yourself available to answer any questions or address any concerns your employees might have. Change management is key here, and how your organization achieves it will determine how well your document management system is adopted.
Empower Collaboration
You should also make collaboration a focal point of all your document management efforts. It will have an almost immediate impact on content creation, for example, as the quality of the work your employees deliver will always be better when they can freely partner with one another during the process.
Likewise, employees should understand their role in helping to organize files, label documents correctly, delete duplicate files and more. All of this goes a long way towards turning your document management system into something that drives results.
Enable Better Communication
If you had to make a list of all the elements that are the cornerstone of any successful project, communication would undoubtedly be right at the top. One of the major reasons why projects fall apart usually has to do with a breakdown in communication — which is why your document management system is about to become invaluable to that end.
By making your document management system your central point for communication, you’re removing the need for people to interact with multiple applications just to get things done. Plus, you’ll be eliminating yet another potential data silo for information to get lost in — which is why this is one step you should take sooner rather than later.
Appoint a Project Leader
This is one of those little best practices that far too many organizations overlook — having a project leader (otherwise known as a community administrator) to oversee your document management system moving forward.
For the best results, try to find someone who already has an idea of how to operate every aspect of the system. Not only will this help you continue to make sense of things, but it also gives your employees someone they can always turn to if they have any questions or concerns.
The Power of Versioning
Versioning is a terrific change to implement in your document management system because it allows people to make as many changes as they’d like to a file without overwriting the original content.
At that point, you can always refer back to an older version of a document if you need to — something that is particularly important while editing. Plus, you’ll be able to see who changed a file, why and when — all of which can be helpful in the long term.
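To illustrate the principle, here is a minimal Python sketch of a version history in which every save appends a snapshot and nothing is ever overwritten. It is a toy model under assumed names, not how M-Files implements versioning.

```python
from datetime import datetime, timezone

class VersionedDocument:
    """Toy version history: every save appends; nothing is overwritten."""

    def __init__(self, name: str, content: str, author: str):
        self.name = name
        self.versions = []  # list of immutable snapshots, oldest first
        self.save(content, author, note="initial version")

    def save(self, content: str, author: str, note: str = "") -> int:
        """Append a new snapshot recording who changed what, and when."""
        self.versions.append({
            "content": content,
            "author": author,
            "note": note,
            "saved_at": datetime.now(timezone.utc),
        })
        return len(self.versions)  # 1-based version number

    def current(self) -> str:
        return self.versions[-1]["content"]

    def get(self, version: int) -> str:
        """Refer back to any older version at any time."""
        return self.versions[version - 1]["content"]

doc = VersionedDocument("contract.docx", "Draft terms...", author="amy")
doc.save("Final terms...", author="bob", note="after legal review")
print(doc.current())  # latest content
print(doc.get(1))     # the original draft is still intact
```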
The Importance of Access Rights
Another important best practice you’ll want to implement involves defining access rights for every file and document in your system. Keep in mind that not every employee is going to need access to every last kilobyte of data in order to properly do their jobs.
Not only will access rights help avoid confusion by ensuring that nobody can view or edit a document they don’t expressly need, but they’ll also go a long way towards safeguarding those documents.
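As a minimal sketch of the concept, the check below grants an action only if one of a user’s groups allows it. The documents, groups, and permissions are hypothetical; a real system would tie these rights to your directory service rather than a hard-coded table.

```python
# Hypothetical access-control list: document -> group -> allowed actions.
ACL = {
    "hr/salaries.xlsx":    {"hr-team": {"read", "edit"}, "finance": {"read"}},
    "public/handbook.pdf": {"everyone": {"read"}},
}

def can(user_groups: set[str], document: str, action: str) -> bool:
    """Allow an action only if one of the user's groups grants it."""
    grants = ACL.get(document, {})
    return any(action in grants.get(group, set()) for group in user_groups)

print(can({"finance"}, "hr/salaries.xlsx", "read"))  # True
print(can({"finance"}, "hr/salaries.xlsx", "edit"))  # False: read-only
print(can({"sales"},   "hr/salaries.xlsx", "read"))  # False: no access at all
```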
Tagging, Tagging, Tagging
Finally, one of the most important changes that you can implement to your document management system involves leaning into the importance of metadata and tagging — something that makes finding the critical data you’re looking for far, far easier than ever before.
At a minimum, you should eliminate all guesswork from the equation by training your employees how to tag files and other documents the right way at the moment of their creation. Not only can this save a significant amount of time, but it can also increase productivity as well.
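Here is a minimal sketch of the idea, with hypothetical tag names: metadata is captured at the moment of creation, and search then matches on what a document is rather than where it happens to live.

```python
# Toy metadata store: documents are tagged at creation, then found by
# what they ARE rather than which folder they sit in.
documents = []

def create_document(name: str, **tags: str) -> dict:
    """Capture metadata at the moment of creation, not as an afterthought."""
    doc = {"name": name, "tags": tags}
    documents.append(doc)
    return doc

def find(**criteria: str) -> list[dict]:
    """Return every document whose tags match all the given criteria."""
    return [d for d in documents
            if all(d["tags"].get(k) == v for k, v in criteria.items())]

create_document("q2-report.docx", type="report", project="apollo", year="2021")
create_document("q2-invoice.pdf", type="invoice", project="apollo", year="2021")

print([d["name"] for d in find(project="apollo", type="invoice")])
# ['q2-invoice.pdf']
```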
As you can see, improving your document management system is less the product of any one major move and more about a series of smaller ones. But these small tweaks add up to something far more powerful than any of them could be on their own — which is a very exciting position for you to be in.
source: https://resources.m-files.com/blog/these-seven-little-document-management-system-changes-can-make-a-big-big-difference
by EDMS Consultants Sdn Bhd | Jun 24, 2021 | driving digital, Nuix, nuix investigate, nuix solution consultants asia pacific, Nuix solution malaysia, Uncategorized
In my early days of forensics, data transfer cables, like data storage and the devices themselves, were a growth area. To stock a competent digital forensics laboratory, you needed the cables and adapters to read all the devices you might find in the wild. These included IDE, the occasional RLL, and about 100 different configurations of SCSI cables. Along with these cables, it was important to have the appropriate write-blocking technology to enable proper preservation of digital evidence while duplicating it.
Times have naturally changed, as I discussed in part 1 of this series. As storage interfaces grew and changed, the type and number of these write blockers grew at the same time. The investigator needed to show up in the field, confident that no matter the size and configuration of a storage device, they had the equipment to properly interface with it and conduct analysis.
While the need to be prepared and competent has not diminished in the slightest, the sheer volume of digital data found at a given crime scene or under a search warrant has exploded, from a bunch of floppy disks and maybe a hard drive or two in the late 90s to multiple tens of terabytes or more in the 2020s. This dramatic increase in raw data has required the high-tech investigator to learn additional strategies to find key data on-site, possibly before performing full forensic analysis in a lab. Tools like Nuix Data Finder and Automatic Classification can be deployed in the field to find crucial items of digital evidence now, not 6-12 months from now when the laboratory backlog gets to your case.
THE DIFFERENCE IN DECADES
I mention ‘prepared and competent’ because it can’t be overstated that what was required in the 90s is darn near trivial when compared to the massive scope of the digital investigations field today.
In a nutshell, investigators in the 90s required knowledge of:
- Windows
- DOS
- Linux
- To a very minor extent, Macintosh/Apple.
The knowledge included how their file systems worked and the technical ability to analyze floppy disks and hard drives using the rudimentary tools of the day, such as Norton Disk Edit.
While networking could be a factor in business investigations, most people using their computers at home dialed up to their service provider and the records were fairly easy to understand.
Fast forward to today and what investigators need to know dwarfs all past generations:
- Operating systems:
  - Windows (multiple flavors)
  - Linux
  - OS X
  - iOS
  - Android
- Storage:
  - SATA/SAS spinning disk
  - SATA/SAS solid-state disk
  - IDE disks
  - SCSI disks
  - NVMe disks
  - M.2 SATA disks
  - Flash storage (SD/Mini-SD/Micro-SD, Compact Flash)
  - USB 2/3/C hard drives
  - Wireless hard drives
  - Home cloud drives
- Cloud storage:
  - Azure
  - AWS
  - A variety of smaller/foreign cloud services
- Connectivity:
  - IPv4 networking
  - IPv6 networking
  - Bluetooth
  - Wi-Fi
  - 3G/4G/5G
- Devices:
  - Digital cameras, with and without network connectivity
  - Tablets (iOS/Android)
  - Raspberry Pi
  - Drones
  - Internet of Things (IoT)
  - Data centers
- Security:
  - Encryption (so many impacts on file storage and networking that it deserves its own novel)
  - Multi-factor authentication
This list goes on and on. It’s almost impossible to recognize the field of high technology investigations when comparing the decades of development and advancement. It’s hard to imagine how a modern investigator can even be moderately competent given the breadth of knowledge required.
After all this history, I’m sure many readers will have some of the same questions. I’ll try to answer what I know I’d be asking, but I encourage you to reach out if you have others that I don’t cover here!
How Can Our Team Cover The Breadth Of Knowledge You’ve Outlined Here?
Assigning properly trained and experienced personnel to the cases that match their skills is vitally important. Given the amount of information out there, no single person in any organization can be best equipped to handle every type of case.
It’s also important to have the appropriate technical and hardware resources on hand to address the challenge of each type of data (and the platform it lives on).
What’s The Key To Ensuring We Are Focusing On The Right Pieces Of Evidence?
The one constant in my high-tech investigations tenure is the ability to interact competently with all types of people. Learning to interview and interrogate where appropriate and paying close attention to the facts of a case, including environment, are crucial components to locating all the data types required in each scenario to perform a thorough examination.
Secondary to the staff’s personal competence and their ability to ask pertinent questions about the environment they are investigating is having a deep bench in terms of hardware, software, and intelligence that will guide them to all available sources of digital evidence. Further, by having the knowledge and experience to learn all about the environment under investigation, the entire staff will be deeply steeped in the art of triage. This enables them to focus on the most-likely-important evidence first and widen the scope as needed to obtain all the facts, without crushing themselves under the weight of trying to analyze it ALL.
Which Tools Should We Use?
This is a slam dunk. Nuix Workstation gives me a single pane of glass into all the evidence types I’m interested in, while Nuix Investigate® allows me to present all the evidence I’ve collected and processed to support staff and case agents, who will perform the detailed review of documents and communications to determine their relevance to the case.
How Do We Fill In The Gaps?
Again, I’ve got the core of most of my needs in the Nuix suite of tools. Where Nuix does not have a solution, like threat intelligence feeds or cooperative intelligence like the ISACS, I can incorporate information from those feeds directly into my Nuix cases and correlate across all the available data to solve the questions posed by the investigation.
EMPOWERING THE MODERN-DAY INVESTIGATOR
We know investigations take on many different forms depending on where you work. While criminal investigations will differ in some ways from, for example, a corporate environment, many of the details remain the same.
I encourage you to visit the Solutions section of our website and see for yourself how Nuix helps investigators in government, corporations, law enforcement, and more.
source: https://www.nuix.com/blog/state-contemporary-digital-investigations-part-2
by EDMS Consultants Sdn Bhd | Jun 21, 2021 | driving digital, Nuix, nuix investigate, Nuix solution malaysia, Uncategorized
Digital investigations have undergone a geometric progression of complexity since my first fledgling technology investigations during the 90s. In those early years, a competent digital forensics professional only needed to know how to secure, acquire, and analyze the floppy disks and minuscule hard drives that represented 99% of data sources at the time.
Since those halcyon days of Norton Disk Edit for deleted file recovery and text searching, there has been a veritable explosion of methods and places to store data. The initial challenges were focused mainly on training the investigators in a new field and the progression in size of available storage for consumers (and therefore investigative targets). While seizing thousands of floppy disks required immense effort to secure, duplicate and analyze, it was still the same data we were used to, just inconveniently stored and frequently requiring assistance from outside resources (thank you Pocatello, Idaho lab).
Information evolution and explosion has a direct impact on the field of investigations. To set the stage for the second half of this two-part investigations blog, in this article I’d like to look back on some of what I feel are the major changes that have occurred over the past 30-odd years.
LET’S CONTINUE OUR TOUR
By the turn of the century, hard drives, initially as small as 10-20 MB, had grown to a ‘staggering’ 10 GB in a high-end computer. Flash media in the form of thumb drives and compact flash cards began to hit the market around the same time and was quickly adopted as the preferred storage medium for the newly minted digital cameras and tablet computers. Some of this media was small enough to be hidden in books, envelopes, and change jars.
Cellular telephones, originally used only for voice communications, quickly advanced to transmit and store data in the form of messages, pictures and even email. As data became more portable, and therefore easier to lose or have stolen, encryption schemes arose that enabled normal consumers to adopt data security strategies that had previously only been used by governments and their spy agencies.
As data speeds increased, so too did the volume of data created and transmitted, necessitating even more novel methods of storage. At about the same time, remote computing moved rapidly from dial-up network services like AOL and CompuServe, to using those services as an entrance ramp of sorts to the internet, to direct internet connections fast enough to make the AOLs of the world unnecessary in their original role; those services instead became content destinations for users connecting over rapidly growing broadband access.
FOLLOW THE DATA
Each step in this transformation required investigators to learn the new ways data moved, where it was stored, and by whom. Just learning who an AOL screen name belonged to required numerous requests and legal actions. Compelling service and content providers alike to divulge these small pieces of data was necessary to determine where connections were being made from and, sometimes, by whom. High-tech investigators became one of many pieces of the dot-com phenomenon.
Data protection services sprung up with the various dot com enterprises; securing data frequently involved transmitting backup data to remote servers. These servers were rented or given away to anyone who wanted them, adding to the complexity of identifying where in the world a given user’s data resided. After determining where the data resided, there were at least another two layers of complexity for the investigator – namely knowing what legal process was required to acquire the remote data and proving who placed the data on the remote servers.
As data quantity exploded, the need for more advanced analysis software became pressing. Several software offerings sprang up in the early days that, unlike Disk Edit, were created for the express purpose of reviewing quantities of digital evidence in a forensically sound manner. Most early digital forensic tools were expensive, complicated, and slow, but they represented an important step in the growing field of digital forensics. The early offerings of both corporate and open-source digital forensic software were anemic compared to today’s digital processing giants.
In some instances, the introduction of 100,000 files was sufficient to bring some tools to their knees, necessitating that forensic cases be analyzed in batches of evidence to avoid taxing the software. Thankfully, this is largely a thing of the past, as products like Nuix Workstation will chew through ten million items without a hiccup, much less a major crash.
Before we knew it, we weren’t just analyzing static data sitting on a local storage device. Network data investigation had to be added to the investigator’s arsenal to determine how data moved across networks, from where and by whom. Along with remote storage services, online communication services exploded across the internet, and suddenly the high-tech criminal had acquired ready access to victims from the very young to the very old for a variety of crimes.
This drastic shift to remote, anonymous communication represented a very new and very real threat that had the added complexity of making not only the criminals difficult to identify, but their victims as well. The traditional transaction involving a citizen walking through the entrance of a police station to report a crime still happened, but new internet crimes meant that when criminals were caught, it was no longer the conclusion of a long investigation. Frequently, it represented the beginning of trying to identify and locate the many victims who either didn’t know where or how to report the crime. This is all because the crimes were facilitated by, or the evidence recorded on, the growing catalog of digital storage.
DEVICES TOO
As digital communication grew, so did the devices used to facilitate it. Cellular phones made a steady shift from plain telephones to a new category commonly referred to as ‘feature phones.’ These phones incorporated digital messaging utilities, including instant messaging, mobile email, and access to portions of the internet through basic web browsers.
With the proliferation of feature phones, a real need for mobile device analysis sprang into existence almost overnight. Text messages on a flip phone were easy to photograph and catalog, but each feature phone had its own interface, requiring investigators to seek out technical solutions to the problem of megabytes of evidence locked in devices that were as non-standard as you could get.
For each manufacturer of cellular devices, there was a different operating system, storage capability and feature set. None of the existing computer forensic tools could acquire or analyze the wide assortment of available handsets. The cherry on the top of these early ‘smart’ phones was the seemingly random shape, size, placement and pin structure of the cables used to charge them. Many phone models came with dedicated companion software for the home computer that enabled backup or access from the computer.
Those same unique charging cables became unique data transfer cables connected to unique software on the host computer system. It was at this time that the first cellular forensic tools appeared. These systems didn’t appear at all like modern cellular forensic tools. They required extra software, hardware devices called ‘twister boxes’ and a literal suitcase of data transfer cables. Much like the early days of digital disk forensics, cellular forensics was a laborious and highly technical enterprise that required a great deal of training and experience to pull off.
Everything changed again in June 2007 with the release of what many consider to be the first true smartphone: the iPhone. Android followed in beta form in November 2007, and the cellular arms race was on. If data quantity and location were an issue before, they were soon to become immensely more serious as the public rapidly adopted the smartphone and began carrying an always-connected, powerful computer in their pockets and purses.
If the high-tech investigation world was difficult before, it was about to become immensely more so. About the only beneficial thing smartphones did for investigators was, over a 6-8 year period, kill off the feature phone and, with it, the suitcase of unique cables. A top-shelf cellular forensic professional can now safely carry five cables to handle the vast majority of phones in use: the original iPhone 30-pin plug (still found in the wild), the newer Apple Lightning cable, and the mini, micro, and USB-C flavors of USB.
But, as you’ll see in part two of this series, that’s about the only positive for investigators. Things have continued to get much more complicated.
Source: https://www.nuix.com/blog/state-contemporary-digital-investigations-part-1