
Showing posts from October, 2017

Large SAP tables which are good candidates for partitioning

Not all tables are good candidates for partitioning. The tool we have developed runs through multiple checks on large tables to identify the right candidates. Generally, if the answer is yes to the following questions, table partitioning may be a suitable strategy for improving performance:
1. Is the database large enough?
2. Is the table large enough?
3. Do your reports and queries have distinguishable access patterns?
4. Are you experiencing slowness and performance degradation?
5. Is the skew factor for the data distribution within reasonable limits?
Below are some of the large tables that have been candidates for table partitioning amongst our SAP customers. Large material master tables:
- MARA: Material Master Table
- MARC: Plant Material Data
- MARO: Company Material Data
- MAKT: Material Text Table
- MARCH: Material Master C Segment: History
- MARD
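As a rough illustration of the "is the table large enough" check, here is a minimal Python sketch that reads row counts and in-memory sizes from the SAP HANA monitoring view M_CS_TABLES via the hdbcli client. The connection parameters, the SAPSR3 schema, the row threshold, and the candidate list are assumptions for illustration only, not the actual checks our tool performs.

```python
from hdbcli import dbapi  # SAP HANA Python client

# Hypothetical connection parameters -- replace with your own system.
conn = dbapi.connect(address="hana-host", port=30015, user="MONITOR", password="secret")

# Illustrative size threshold, not a recommendation from the post.
ROW_THRESHOLD = 100_000_000

CANDIDATES = ["MARA", "MARC", "MAKT", "MARD"]

cursor = conn.cursor()
cursor.execute(
    """
    SELECT TABLE_NAME, SUM(RECORD_COUNT), SUM(MEMORY_SIZE_IN_TOTAL)
    FROM M_CS_TABLES
    WHERE SCHEMA_NAME = ? AND TABLE_NAME IN ({})
    GROUP BY TABLE_NAME
    """.format(",".join("?" * len(CANDIDATES))),
    ["SAPSR3"] + CANDIDATES,  # "SAPSR3" is an assumed ABAP schema name
)

for table_name, record_count, memory_bytes in cursor.fetchall():
    large_enough = record_count >= ROW_THRESHOLD
    print(f"{table_name}: {record_count} rows, "
          f"{memory_bytes / 1024**3:.1f} GiB in memory, "
          f"partitioning candidate: {large_enough}")

cursor.close()
conn.close()
```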

SAP HANA exploits partitioning in SAP BW for optimization

Partitioning can significantly improve query performance, depending on the types of queries and on your hardware configuration, because the system locates the requested information faster: it filters out all the irrelevant partitions and thus reduces the data volume to be read. This is a standard feature in most databases, but SAP, being database agnostic, did not truly exploit it until HANA. In SAP R/3, SAP provided partition keys for BW but did not partition data during installation. With HANA, table partitioning/distribution is automated for SAP NetWeaver Business Warehouse: database tables created for InfoCubes or classic DataStore objects are partitioned on the database. While this technical partitioning of data at the database level helps optimize the runtime of BW processes to a great extent, it alone is inadequate for building and maintaining data warehouses that scale easily with growing data volumes. Hence, to address some of the aspects of dealing with
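To make the pruning idea concrete, the sketch below issues a query that restricts on the partitioning column of an assumed BW fact table. When the filter matches the partitioning criterion, HANA only has to scan the partitions that can contain matching rows. The table name, the CALMONTH partitioning column, and the AMOUNT key figure are illustrative assumptions, not taken from a real system.

```python
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015, user="MONITOR", password="secret")
cursor = conn.cursor()

# "/BIC/FSALESCUBE" is an assumed name for an InfoCube fact table
# range-partitioned by calendar month. Because the WHERE clause restricts
# the partitioning column, HANA can prune all partitions outside 2017-10.
cursor.execute(
    'SELECT SUM("AMOUNT") FROM "SAPSR3"."/BIC/FSALESCUBE" WHERE "CALMONTH" = ?',
    ("201710",),
)
print(cursor.fetchone()[0])

cursor.close()
conn.close()
```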

Partitioning in SAP HANA database

SAP HANA provides two ways of distributing data:
1. Database partitioning: different tables are distributed across several hosts (SAP Note 2044468 - FAQ: SAP HANA Partitioning).
2. Table partitioning: a column store table is split and distributed across several hosts (SAP Note 2081591 - FAQ: SAP HANA Table Distribution).

SAP HANA supports:
1. Hash partitioning
2. Range partitioning
3. Round-robin partitioning

In SAP HANA, we can create:
1. Single-level partitioning
2. Multi-level partitioning

The optimizing algorithm takes into account factors such as:
- Balance tables equally so that they are spread over all the slave nodes and make optimal use of the available physical memory on each node.
- Ensure that tables/table partitions belonging to the same table groups (see tables _SYS_RT.TABLE_PLACEMENT, SYS.TABLE_GROUPS) are located on the same nodes, as they are frequently queried together from the application side via joins and if they
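For reference, here is a sketch of the corresponding DDL for hash, range, round-robin, and multi-level (hash-range) partitioning, sent as plain SQL through hdbcli. The table names, columns, partition counts, and date ranges are made up for illustration; the PARTITION BY clauses follow the SAP HANA SQL syntax for column store tables.

```python
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015, user="DDL_USER", password="secret")
cursor = conn.cursor()

statements = [
    # Hash partitioning: rows spread over 4 parts by the hash of DOC_ID.
    """CREATE COLUMN TABLE SALES_HASH (DOC_ID INT, POSTING_DATE DATE, AMOUNT DECIMAL(15,2))
       PARTITION BY HASH (DOC_ID) PARTITIONS 4""",
    # Range partitioning: rows placed by POSTING_DATE interval.
    """CREATE COLUMN TABLE SALES_RANGE (DOC_ID INT, POSTING_DATE DATE, AMOUNT DECIMAL(15,2))
       PARTITION BY RANGE (POSTING_DATE)
       (PARTITION '2017-01-01' <= VALUES < '2018-01-01', PARTITION OTHERS)""",
    # Round-robin partitioning: rows assigned to 4 parts in turn, no key column needed.
    """CREATE COLUMN TABLE SALES_RR (DOC_ID INT, AMOUNT DECIMAL(15,2))
       PARTITION BY ROUNDROBIN PARTITIONS 4""",
    # Multi-level partitioning: first level by DOC_ID hash, second level by date range.
    """CREATE COLUMN TABLE SALES_MULTI (DOC_ID INT, POSTING_DATE DATE, AMOUNT DECIMAL(15,2))
       PARTITION BY HASH (DOC_ID) PARTITIONS 4,
       RANGE (POSTING_DATE)
       (PARTITION '2017-01-01' <= VALUES < '2018-01-01', PARTITION OTHERS)""",
]

for ddl in statements:
    cursor.execute(ddl)

cursor.close()
conn.close()
```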

The challenge of AI and deep learning systems

One of the biggest challenges of AI is that it is not plug-and-play. You have to customize AI to your business context. A common pattern in AI is mapping an input to an output. For example:
- input a picture, output 1 if it is a cat, or 0 if it is a dog
- input an audio file, output a transcript
- input Spanish text, output English text
- input customer logs, output 1 if the customer will churn, or 0 if they will not churn
- input transaction logs, output 1 if the transaction is fraud, or 0 if it is not
The challenge, and also the opportunity, of AI is discovering which input and corresponding output fit your business context. Maybe your group writes a lot of SQL, and the request for SQL arrives as a ticket created in JIRA in plain English; then the input in this case will be the English description of the SQL needed, along with the database schema, and the output will be SQL. Maybe you spend time deciding if an item should be retired from your catalog. Then the input will be ite
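As a minimal, concrete instance of this input-to-output mapping, here is a sketch of the "customer logs in, churn label out" example. The tiny hand-written dataset and the scikit-learn TF-IDF plus logistic regression pipeline are purely illustrative stand-ins; a production system would train a deep model on real logs.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy, made-up training data: customer log text -> 1 (churned) / 0 (stayed).
logs = [
    "opened 5 support tickets, asked how to cancel subscription",
    "requested refund, complained about repeated outages",
    "renewed annual plan, invited two colleagues to the workspace",
    "upgraded to premium tier, increased API usage this month",
]
labels = [1, 1, 0, 0]

# Simple text-classification pipeline standing in for a deep model.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(logs, labels)

new_log = "asked for data export and how to close the account"
print("churn prediction:", model.predict([new_log])[0])
```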

Crowdsourcing Sales Teams

Lead Generation in a Startup's Life
Here is the typical journey of a startup. Founders tap into their LinkedIn networks. They pitch their product / vision and in the process discover a ton of things, such as how to sell, what messaging resonates, who the target customer is, when the product is likely to garner interest, and what questions potential customers will have. As you can imagine, this is a lot of figuring out, and there is a lot of trial and error. It does not take long before the founders exhaust their LinkedIn networks. It is also very rare that by this stage they have found product-market fit. But the founders have to continue to generate a steady flow of leads. In fact, the pressure is even higher, because as time passes the startup needs to show sales traction. However, companies can't simply hire a sales team; in fact, it is strongly recommended that they don't, because if you hire a sales team prematurely you will sink your company. So here is a chicken-and-egg pr

Using Deep learning to integrate range partitioning into SAP systems

The principle of partitioning is very simple: it means splitting a large table into smaller parts, called partitions, grouping and separating the rows of the table based on the content of one or more columns. Rows fulfilling the same criteria are placed in the same partition. Partitioning can provide tremendous benefit to a wide variety of applications by improving performance, manageability, and availability.
- By limiting the amount of data to be examined or operated on, and by providing data distribution for parallel execution, partitioning provides multiple performance benefits.
- Partitioning enables you to split tables and indexes into smaller, more manageable units, providing database administrators with the ability to pursue a divide-and-conquer approach to data management.
- Partitioned database objects provide partition independence. This characteristic of partition independence can be an important part of a high-availability strategy. For example, if one part
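To make the principle concrete, here is a small self-contained Python sketch that "partitions" a handful of rows by the year of a date column, the same grouping-by-column-content idea that a database applies internally when range partitioning. The rows and the yearly ranges are made up for illustration.

```python
from collections import defaultdict
from datetime import date

# Made-up rows: (document id, posting date, amount).
rows = [
    (1, date(2015, 3, 14), 120.0),
    (2, date(2016, 7, 2), 80.5),
    (3, date(2016, 11, 23), 310.0),
    (4, date(2017, 1, 9), 55.0),
]

def partition_key(posting_date):
    """Range criterion: one partition per calendar year."""
    return posting_date.year

# Group rows into partitions based on the content of the date column.
partitions = defaultdict(list)
for row in rows:
    partitions[partition_key(row[1])].append(row)

# A query restricted to 2016 only needs to touch the 2016 partition,
# which is the pruning benefit described above.
for row in partitions[2016]:
    print(row)
```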

A unique application of deep learning in handling customer complaints and questions

Almost every site asks customers to submit their requests via a form. Most businesses can't afford a call center, and even the ones that do obscure their phone numbers because they want to minimize the number of calls and hence human operators. This means that contact-us forms are here to stay. Now consider the default experience of a contact-us form. The user sends a message such as "I want to know if you do a free pilot of your service?" or "How can I get access to my tax documents for 2015?". The standard email response is "we got your message and we are looking into this". But can we do better? AI, especially deep learning, has been shown to have some success in automated email replies. Imagine applying the same principle to a contact-us or report-an-issue form. To train such a system you can simply feed in all the previous responses to all the previous questions asked. The deep learning system can then, for new requests, auto-generate an
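As a rough sketch of the idea, the snippet below suggests a reply to a new contact-us message by retrieving the most similar previously answered question (TF-IDF nearest neighbour). This retrieval baseline is a deliberately simplified stand-in for the deep learning reply-generation system described above, and the question/response pairs are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented history of (question, response) pairs from the contact-us form.
history = [
    ("Do you offer a free pilot of your service?",
     "Yes, we offer a 30-day pilot. Reply with your company name to get started."),
    ("How can I get access to my tax documents for 2015?",
     "Tax documents are available under Account > Statements; select the year you need."),
    ("How do I reset my password?",
     "Use the 'Forgot password' link on the login page to receive a reset email."),
]

questions = [q for q, _ in history]
vectorizer = TfidfVectorizer()
question_vectors = vectorizer.fit_transform(questions)

def suggest_reply(new_message: str) -> str:
    """Return the stored response whose question is most similar to the new message."""
    similarities = cosine_similarity(vectorizer.transform([new_message]), question_vectors)
    best_match = similarities.argmax()
    return history[best_match][1]

print(suggest_reply("Is there a free trial or pilot available?"))
```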

How deep learning and AI can revolutionize the treatment of depression and anxiety

It's hard to imagine how AI could help in therapy. How can AI help in reducing depression and anxiety? We at SublimeAI have talked to various therapists, psychiatrists, and counselors. One emerging trend is remote therapy, where the therapist is at a remote location and conducts therapy over video chat such as Skype, WhatsApp, or Google Hangouts. Many therapists insist that the session be video based, because 90% of human communication is non-verbal. So how can deep learning improve the therapy experience? Deep learning is able to recognize human emotions, and if trained well it can identify subtle ones. These emotions can then be monitored and catalogued, so a therapist can review at what points in the conversation the patient showed anger, at what points they showed sadness, and when they showed happiness. This can significantly help the therapist identify the trigger points and then focus on addressing them. Deep learning can also be used to cla
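As a sketch of the monitoring-and-cataloguing step, the snippet below takes per-second emotion labels, as a real system would obtain from a deep learning facial-emotion classifier run on the video frames, and produces the kind of timeline a therapist could review. The `classify_frame` stub and the 60-second session are assumptions for illustration; it stands in for an actual trained model.

```python
from collections import Counter

def classify_frame(frame_index: int) -> str:
    """Stub for a deep learning facial-emotion classifier applied to one video frame."""
    # Invented output purely for illustration.
    fake_labels = ["neutral", "neutral", "sadness", "sadness", "anger", "neutral", "happiness"]
    return fake_labels[frame_index % len(fake_labels)]

# One label per second of a hypothetical 60-second therapy session recording.
timeline = [(second, classify_frame(second)) for second in range(60)]

# Catalogue at which points each emotion appeared, for the therapist to review.
moments = {}
for second, emotion in timeline:
    moments.setdefault(emotion, []).append(second)

for emotion, seconds in moments.items():
    print(f"{emotion}: seen at seconds {seconds[:5]}{'...' if len(seconds) > 5 else ''}")

# Overall distribution of emotions across the session.
print(Counter(emotion for _, emotion in timeline))
```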