Tuesday, July 26, 2016

XSS solution using blacklisting

What is the blacklisting approach? Define the untrusted scripting characters and encode (or remove) them in the actual request.
Challenges
  • Many HTML features allow scripting (e.g., some parts of the page are generated by the backend)
  • The application may have custom extensions to HTML
Solution
  • Define a servlet filter (XSSFilter) – the filter returns a sanitized HTTP request wrapper
  • Define a sanitized HTTP request wrapper – override every method and return the clean value by applying the sanitization
  • Define sanitization rules – rules are nothing but configurable untrusted characters per page, per parameter
  • Define the sanitization itself (this cleans the unsafe characters); a minimal sketch is shown below
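
Here is a minimal sketch of the filter and wrapper described above; the class names and the blacklist characters are illustrative, not from the original post:

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;

public class XSSFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        // Wrap the request so every parameter read goes through sanitization
        chain.doFilter(new SanitizedRequestWrapper((HttpServletRequest) request), response);
    }

    @Override
    public void init(FilterConfig config) {
    }

    @Override
    public void destroy() {
    }
}

class SanitizedRequestWrapper extends HttpServletRequestWrapper {

    SanitizedRequestWrapper(HttpServletRequest request) {
        super(request);
    }

    @Override
    public String getParameter(String name) {
        return sanitize(super.getParameter(name));
    }

    @Override
    public String[] getParameterValues(String name) {
        String[] values = super.getParameterValues(name);
        if (values == null) {
            return null;
        }
        String[] clean = new String[values.length];
        for (int i = 0; i < values.length; i++) {
            clean[i] = sanitize(values[i]);
        }
        return clean;
    }

    // Blacklist-style sanitization: strip the configured untrusted characters.
    // A real implementation would look up per-page, per-parameter rules and
    // would also override getHeader(), getParameterMap(), etc.
    private String sanitize(String value) {
        if (value == null) {
            return null;
        }
        return value.replaceAll("[<>\"']", "");
    }
}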

Cross-Site Request Forgery Attack Prevention

What is CSRF ?

Cross-Site Request Forgery (CSRF) is a type of attack that occurs when a malicious web site, email, blog, instant message, or program causes a user's web browser to perform an unwanted action on a trusted site for which the user is currently authenticated. CSRF exploits the trust that a site has in a user's browser (reference).
Below are several things an attacker looks for to execute a CSRF attack:
  • A site that doesn't validate the Referer header (which is common).
  • A form submission at the target site, or a URL that has side effects, that does something (e.g., transfers money, or changes the victim's e-mail address or password).
  • A way to lure the victim to a web page with malicious code while the victim is logged in to the target site.

Approach for preventing CSRF

Web sites have various CSRF countermeasures available:
  • Checking the HTTP Referer header for the same domain name as the current URL;
  • Limiting the lifetime of authentication cookies
  • CSRF Token: Embedding additional authentication data into requests that allows the web application to detect requests from unauthorized locations.

Token-based Approach

The idea is to embed additional authentication data into requests so that the web application can detect requests from unauthorized locations:
  • Generate a token for each new session
  • Generate a hidden field in every form carrying the token from the session; this becomes the token in the request
  • Compare the token from the request with the token in the session in a servlet filter to ensure they match; if they don't match, the form was posted by an attacker from an unknown source, so redirect them to the login page
Generate the token and put it in the session
If the session is new, generate a random token, encrypt it, and put it in the session:

if (newSession) {
    // ... other new-session setup ...
    // encrypted() is an application-specific helper (sketched below)
    String encryptedCSRFKey = encrypted(UUID.randomUUID().toString());
    request.getSession().setAttribute(CSRF_TOKEN, encryptedCSRFKey);
}
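
The encrypted() helper is not shown in the original snippet; here is a minimal, hypothetical sketch using the JDK's AES support (key management is simplified, and in practice the key would live outside the class). Note that the unpredictability of the random UUID is what actually protects the token; the encryption step is extra hardening.

import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public final class TokenCrypto {

    // Hypothetical key handling: generated once per JVM for illustration only.
    private static final SecretKey KEY = newKey();

    private static SecretKey newKey() {
        try {
            KeyGenerator generator = KeyGenerator.getInstance("AES");
            generator.init(128);
            return generator.generateKey();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    public static String encrypted(String plain) {
        try {
            Cipher cipher = Cipher.getInstance("AES");
            cipher.init(Cipher.ENCRYPT_MODE, KEY);
            byte[] bytes = cipher.doFinal(plain.getBytes(StandardCharsets.UTF_8));
            return Base64.getEncoder().encodeToString(bytes);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }
}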
Generate the HTML field with the same token
Two approaches:
  • Create a taglib for the <form> element which automatically inserts/generates a hidden HTML element carrying the token (named CSRF_TOKEN in the snippets below)
  • Manually create the hidden HTML element; a sample code snippet for populating it is shown below
The end goal is that a hidden field is generated with the token key as present in the session:
<input type="hidden" name="CSRF_TOKEN" value="<bean:write name="CSRF_TOKEN" scope="session" filter="true"/>"/>
Filter to validate the extra token
Create a servlet filter which intercepts every request and compares the posted token with the session token:

if (session == null) {
    chain.doFilter(request, response);
    return;
} else {
    // validate the CSRF token
    String sessionToken = session.getAttribute("CSRF_TOKEN").toString();
    String requestToken = httprequest.getParameter("CSRF_TOKEN");
    if (sessionToken.equals(requestToken)) {
        chain.doFilter(request, response);
    } else {
        // token mismatch: rotate the session token and reject the request
        CommonUtils.updateSessionToken(session);
        httpresponse.sendRedirect("/errorPage.jsp");
    }
}
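
CommonUtils.updateSessionToken() above is an application-specific helper that is not shown in the post; a minimal sketch that simply rotates the session token might look like this:

import java.util.UUID;
import javax.servlet.http.HttpSession;

public final class CommonUtils {

    // Hypothetical helper: replace the session token after a failed check so a
    // leaked or guessed value cannot simply be retried.
    public static void updateSessionToken(HttpSession session) {
        session.setAttribute("CSRF_TOKEN", UUID.randomUUID().toString());
    }
}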

Tuesday, June 21, 2016

Implementing Continuous Delivery

This is a continuation of the previous post, "Continuous Delivery - Introduction".
When we think of Continuous Delivery, we mostly think about the tools, which are important; however, the real challenge comes from the existing design of your system, especially when it is a monolithic architecture, which slows developers down and de-motivates them from refactoring for the project's betterment.
Here are a few of the many challenges with a monolithic architecture:
1. Build: Everything is configured to run under one build task, which slows down the build and application start-up time.
2. Test cases: Test suites tend to grow too large, making the feedback loop slow, which makes them an enemy instead of a companion.
3. Single project for all code: This compounds the above two problems, and also results in less ownership when multiple functional teams work on the same project.
4. Management: Technical debt is hard to manage.
Here are some steps toward resolving those challenges.

Convert the project into a multi-module project

Split the monolithic application into multiple modules. You may not be able to move the source code quickly; however, creating the modules and moving the test cases into their respective modules should be easier and less risky. Then create a separate Jenkins pipeline for each module.
Advantages
1. Each module has its own development pipeline
2. Focused ownership, as each functional team is responsible for its own module
3. Quick feedback, as module-specific test cases execute instead of the large test suite
Tools
Gradle: Customize the build script to create the multi-module project; Gradle also supports parallel task execution
Project structure:
  Project/build.gradle (root)
  Project/Module/Module1/build.gradle (Module1)
  Project/Module/Module2/build.gradle (Module2)

Categorize the test cases

Small/Unit tests:
These are method-level unit test cases which run in memory and in isolation, without any dependency. Below are the advantages of these test cases:
1. They run really fast, so they should be triggered for each commit
2. They are primarily useful for developers to understand the code and design; they motivate developers to refactor the code, as there are test cases to validate it.
3. They help developers collaborate well
Medium/Component tests:
Module-level test cases that test each component in a flow:
1. These should also run quickly; important test cases should run as part of the development pipeline, and all test cases should run as part of the master pipeline (before reaching QA)
2. No mocks for components in the flow
3. Mock other modules/external dependencies
4. They are primarily useful for the business to ensure business test cases are green and covered
Large/Integration tests:
Application-level test cases which test the entire flow (excluding external dependencies):
1. They are slow in nature; they should run as part of the QA pipeline.
2. No mocks for any component in the flow
3. Mocks or test accounts for external dependencies
4. Primarily useful to test the integrity of the system
Source folders
  src/test – unit tests
  src/test-component – component tests
  src/test-integration – integration tests
Practical challenge: The existing project may not have enough unit test cases, and the organization may not invest in unit tests as they do not add direct business value. A possible solution is to convince management to invest in component/medium tests, as they validate the business and reduce the regression testing effort.
Tools
Gradle: Use it to categorize the test cases; configure each category as a source folder and run each category as a separate build step
TestNG: Use TestNG to execute the test cases in parallel; an illustrative sketch of categorizing tests follows below
Selenium: To run UI/component tests
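
As an illustration of the categories above, here is a minimal sketch using TestNG groups to tag tests by size (the post itself uses separate source folders; the class and method names here are hypothetical):

import org.testng.annotations.Test;

// Hypothetical example: tagging tests by size so each category can run
// as a separate build step (e.g., only the "unit" group on every commit).
public class AccountServiceTest {

    @Test(groups = "unit")
    public void interestIsRoundedToTwoDecimals() {
        // small test: pure in-memory, no dependencies
    }

    @Test(groups = "component")
    public void transferUpdatesBothAccounts() {
        // medium test: real components in the flow, other modules mocked
    }

    @Test(groups = "integration")
    public void endToEndTransferFlow() {
        // large test: no mocks in the flow, test accounts for external systems
    }
}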

Code coverage

Code coverage is a way of ensuring that tests are actually exercising the code. It helps ensure the quality of the tests, not the quality of the product under test.
Why code coverage?
Tests by themselves do not provide enough confidence unless we know that they achieve significant code coverage. Having all tests successful while, for example, covering only 15% of the code cannot provide enough trust.
In general, code coverage is tied to unit tests, but it should be used with any type of testing, including integration/functional/manual testing.
Tools
JaCoCo: Can be used to capture back-end code coverage. It has two steps: first, capture the coverage data; second, process the coverage data and generate the HTML report
Istanbul: Can be used to capture JavaScript code coverage.
Gradle: Gradle has built-in plugin support for JaCoCo
JaCoCo can also be configured as an agent JAR on the server so that it captures coverage data for manual testing as well.

Provision the deployment environment

Use Docker to provision an environment which is recreated with every build; the web server (Tomcat etc.), DB, and Selenium hub/nodes should each run inside their own container, linked to each other
Advantages
1. Tear down and create the environment with every build: infrastructure as code.
2. The application is deployed and bootstrapped in isolation, in a reproducible environment
3. Docker containers are version controlled
4. Developers push the image and Jenkins pulls the container in different environments
(Figure: CD environment deployment)

Pipeline as Code

1. Jenkins is used as the CI tool
2. Job maintenance is a pain
3. Create jobs programmatically (Job DSL and Pipeline plugins)
Tool: Jenkins Job DSL is a Groovy-based scripting language which supports generating Jenkins jobs.

(Figure: generated job)

Tuesday, May 3, 2016

Continuous Delivery - Introduction

What is CD?


Every commit made to the workspace should be a release candidate for production, and it should be deployable to production with the push of a button. The release itself is still a manual process and requires an explicit push of a button.

Why CD?

  1. Eric S. Raymond: Release early. Release often. And listen to your customers.
  2. Steve Jobs: You can't just ask customers what they want and then try to give that to them. By the time you get it built, they'll want something new.

Problem: In many organizations, release cycles are measured in weeks or months, and the release process is certainly not repeatable or reliable because it is manual and often requires a team of people to deploy the software into testing/staging/production.

Why is the release cycle risky and long?
Most release processes include many of the steps mentioned below, where any one step may go wrong.

  1. Environments created individually by the IS team
  2. Installation of software that the application relies on
  3. Configuration copied/created through a different process
  4. Reference/master data created/copied
  5. Manual patches executed on production that need to be recorded


Solution: Continuous Delivery practices enable you to make your releases fast, repeatable, reliable, and visible through an automated process with well-understood, quantifiable risks. The goal is to deliver useful, working software to users as quickly as possible.


Benefits:

  1. Anyone from any team can deploy to production (without special knowledge)
  2. The product is always production-ready throughout its entire lifecycle
  3. Developers aren't done when a feature is delivered to QA or when the feature is "QA passed". They are done when it is working in production (or a production-like environment)

Steps toward Continuous Delivery

Prerequisites

  1. Version control
  2. Build system
  3. Deployments
  4. QA Automation


Key components of Continuous Delivery

  1. Configuration Management(CM)
  2. Continuous Integration(CI)/Automated build
  3. Continuous Deployment/Automated deployment
  4. Continuous Testing/Automated Testing

Configuration Management

Configuration management (CM) is the detailed recording and updating of information that describes the application and its dependent software. It typically includes:
  1. Versions
  2. Updates that have been applied to installed software packages
  3. Locations and network addresses of hardware devices

Types of CM dependencies
  1. Internal dependency management (libraries)
  2. External dependency management (software/hardware/OS)
  3. Environmental dependency management (QA/staging/production)
Types of configuration
  1. Build time
  2. Packaging time
  3. Deployment time
  4. Startup time
Note: Do not pass configuration at build time or packaging time; it should be set at deployment time or startup time through environment variables, as in the sketch below.
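
A minimal Java sketch of startup-time configuration read from the environment (the variable name APP_DB_URL is illustrative, not from the original post):

public class AppConfigLoader {

    // Resolve configuration at startup from the environment rather than
    // baking it into the build artifact; the variable name is illustrative.
    public static String databaseUrl() {
        String url = System.getenv("APP_DB_URL");
        if (url == null) {
            throw new IllegalStateException("APP_DB_URL is not set for this environment");
        }
        return url;
    }
}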

Tools needed to manage configuration
  1. Version control (Git): Keep everything in version control, including internal/external/environmental dependencies
  2. Dependency management tool (Gradle/Maven): Use this to manage internal application dependencies
  3. Docker/Vagrant/virtualization/Puppet/Chef: Use these for external dependency management
  4. RDBMS, LDAP, file system, web service: Use any of these to store environmental configuration, which should be picked up based on the environment name.

Continuous Integration

Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.

Build tools: Ant, Maven, Gradle, Make
CI tools: Jenkins, CruiseControl, Go, TeamCity, Bamboo

Continuous Integration should include: unit tests, component tests, and acceptance tests

CI mainly focuses on development teams. The output of the CI system normally forms the input to the manual testing process and hence to the rest of the release process. Much of the waste in releasing comes from the progress of software through testing and operations.

CI alone is not enough, as:
  1. The build and ops teams are waiting for documentation for deployment or a fix
  2. Testers are waiting for a good build
  3. The dev team receives the bug report a week after the team has moved on to a new feature
  4. Something incompatible is discovered toward the end of deployment

Continuous Deployment

A Continuous Deployment pipeline is an automated process for releasing the product from version control to production, by any user in the organization.


Continuous Testing

Write automated tests at multiple levels (unit, component, acceptance, and regression) and run them as part of the deployment pipeline, and ensure they have great coverage; CI and CD are only possible if the test suite has great coverage. This brings quality to projects.

Types of testing
  1. Functional testing (automated) (QA test)
  2. Unit testing (automated) (dev test)
  3. Showcase/usability/exploratory testing (manual) (QA test)
  4. Non-functional acceptance testing (manual/automated) (QA test)

Best Practices

  1. Treat your build scripts and configuration the same as your code; they should be tested and refactored so that they are always tidy and understandable
  2. Continuous Delivery is a practice, not a tool. It requires a degree of commitment and discipline from teams. You need everyone to check in small incremental changes frequently to mainline and agree that the highest priority is to fix a broken build
  3. Follow test-driven development, as it drives your application design and documentation
  4. Use human-readable tests and suites.
  5. Write the acceptance criteria and automate them.

Implementing Continuous Delivery

Required tools
  1. Docker is a container technology that allows you to package an application with all of its dependencies into a standardized unit for software development. In CD it is used to set up configuration management (CM)
  2. Gradle is a build automation system built on the concepts of Ant and Maven, with a Groovy-based DSL instead of XML
  3. Jenkins is an integration tool which provides a continuous integration service for development

Build your configuration

Build installation dependencies using Docker
  1. Declare OS/software dependencies using a Dockerfile; e.g., to test a web application using Selenium, we need three Dockerfiles, one for each part of the environment, each with its installation commands
  2. Build the image: execute the Dockerfile and create the image
  3. Link the images based on their dependencies and create the build environment (e.g., the app depends on the database)
Note: With Docker in place, the software installation becomes repeatable and reliable


Build project library dependencies
  1. Use the Gradle dependency management tool to define the internal dependencies
Build script
  1. Use Gradle to create the build script to:
    • Compile the code
    • Create the Docker environment
    • Run the tests in that environment
    • Destroy the environment (kill the Docker containers)
  2. Use Gradle to generate the various quality and health reports

CI setup

Set up Jenkins to execute the Gradle task which creates the Docker environment to run the product.



Continuous deployment pipeline



Tuesday, April 26, 2016

Issues after upgrading to Ubuntu 16.04 (Git, Docker)

I am always eager to be on the latest version, so I upgraded to Ubuntu 16.04; soon I started facing unexpected errors in my applications.

DSA-based SSH keys not working with Git

I tried to pull my code from a Git repository, and the error I got was "Permission denied". After spending almost a full day, I learned that OpenSSH 7.0, which ships with Ubuntu 16.04, has disabled support for DSA-based SSH keys (reference).

$ git pull origin master     
  
Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

Solution
You can re-enable support locally by updating your ~/.ssh/config file with lines like so:
 
Host *
PubkeyAcceptedKeyTypes=+ssh-dss
 
 
  

 Docker not working

After upgrading to Ubuntu 16.04, my Docker containers started failing with various errors, but most of them had one error message in common at the end:

Resource temporarily unavailable

The primary error occurred because systemd has been updated in Ubuntu 16.04 to limit the number of processes or threads per service:


There's a new system.conf setting DefaultTasksMax= to control the default TasksMax= setting for services and scopes running on the system. (TasksMax= is the primary setting that exposes the "pids" cgroup controller on systemd and was introduced in the previous systemd release.) The setting now defaults to 512, which means services that are not explicitly configured otherwise will only be able to create 512 processes or threads at maximum.


Run the command below and notice "Tasks: 502 (limit: 512)" in the output:
 systemctl status docker



Solution

Update the TasksMax limit to infinity and verify as below:
  1. Set TasksMax=infinity in the [Service] section of docker.service at /lib/systemd/system/docker.service
  2. Run 'systemctl daemon-reload' to reload the changed units

Wednesday, January 25, 2012

Annotation-Based Spring Application context




Our goal is to get rid of manual wiring in the configuration files (Java or XML).

Annotation-based application context

Define a configuration bean as the application context

Define a configuration Java class annotated with @Configuration; this class is the annotation-based application context:

@Configuration
public class ApplicationConfig {

}

Register your beans

  • Register your Java beans using @ComponentScan; this auto-detects classes in the package com.sudhir as Spring beans

@Configuration
@ComponentScan(basePackages = "com.sudhir", excludeFilters = {@ComponentScan.Filter(Configuration.class)})
public class ApplicationConfig {

}

There are 4 stereotype annotations:
@Component – indicates that an annotated class is a "component".
@Repository – indicates that an annotated class is a "Repository" (or "DAO").
@Service – indicates that an annotated class is a "Service" (e.g. a business service facade).
@Controller – indicates that an annotated class is a "Controller" (e.g. a web controller).

Here is an example of a DAO class:


@Repository
public class BeerDaoImpl implements BeerDao{
}

Inject the bean as a dependency

Inject the required bean using @Autowired; in this example the dataSource bean is injected via autowiring:

@Repository
public class BeerDaoImpl implements BeerDao {

    JdbcTemplate jdbcTemplate;

    @Autowired
    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
}

Register third-party beans

DataSource is not your bean, so how do you inject the data source bean? What if the beans are third-party classes? We need to define such beans in the configuration class. Configuration for the data source:
a. We don't want to hard-code the properties for the data source; we want to use a properties file
b. Add the @PropertySource annotation to get the properties file values into the Environment class
c. Inject the Environment class using @Autowired
Java-based configuration file for the data source:


@Configuration
@ComponentScan(basePackages = "cybage", excludeFilters = {@ComponentScan.Filter(Configuration.class)})
@PropertySource("classpath:jdbc.properties")
public class ApplicationConfig {

    @Autowired
    Environment env;

    @Bean
    public DataSource dataSource() {
        com.mchange.v2.c3p0.ComboPooledDataSource dataSource = new com.mchange.v2.c3p0.ComboPooledDataSource();
        try {
            dataSource.setDriverClass(env.getProperty("driverClass"));
        } catch (PropertyVetoException e) {
            e.printStackTrace();
        }
        dataSource.setJdbcUrl(env.getProperty("jdbcUrl"));
        dataSource.setUser(env.getProperty("user"));
        dataSource.setPassword(env.getProperty("password"));
        return dataSource;
    }
}


Write the test class

Because your context is an AnnotationConfigApplicationContext, the loader should be AnnotationConfigContextLoader.class, and the configuration class should be the class annotated with @Configuration. If you have more than one configuration class, use:

classes = {ApplicationConfig.class, ApplicationConfig1.class}

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = AnnotationConfigContextLoader.class, classes = ApplicationConfig.class)
public class BeerDaoImplTest extends AbstractJUnit4SpringContextTests {

    private BeerDao beerDao;

    @Autowired
    protected void setBeerDao(BeerDao beerDao) {
        this.beerDao = beerDao;
    }
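
    // Illustrative test method (hypothetical, not in the original post):
    // verifies that the DAO was autowired from the annotation config.
    // Assumes imports for org.junit.Test and org.junit.Assert.
    @Test
    public void beerDaoIsInjected() {
        Assert.assertNotNull(beerDao);
    }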

           
}

Spring IoC without XML


Since Java adopted annotations, most applications have started using them to cut down on XML configuration. Spring 3 also introduced annotation-driven configuration; let's explore it.
Here is Martin Fowler's famous dependency injection example of MovieLister and MovieFinder.

MovieFinder

public interface MovieFinder {
    List<Movie> findAll();
}

Implementation of MovieFinder
public class ColonMovieFinder implements MovieFinder {

    Resource filename;

    public Resource getFilename() {
        return filename;
    }

    public void setFilename(Resource filename) {
        this.filename = filename;
    }

    @Override
    public List<Movie> findAll() {
        List<Movie> movies = null;
        try {
            movies = inputStreamAsString(this.filename.getInputStream());
        } catch (IOException e) {
            e.printStackTrace();
        }
        return movies;
    }
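
    // Hypothetical helper assumed by findAll() above (not shown in the post):
    // parses colon-delimited "title:director" lines into Movie objects.
    // Assumes imports for java.io.* and java.util.*, and a Movie(String, String)
    // constructor.
    private List<Movie> inputStreamAsString(InputStream in) throws IOException {
        List<Movie> movies = new ArrayList<Movie>();
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        String line;
        while ((line = reader.readLine()) != null) {
            String[] parts = line.split(":");
            movies.add(new Movie(parts[0].trim(), parts[1].trim()));
        }
        return movies;
    }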
           
}      

MovieLister needs MovieFinder dependency

public class MovieLister {

    private MovieFinder finder;

    public void setFinder(MovieFinder finder) {
        this.finder = finder;
    }

    public Movie[] moviesDirectedBy(String arg) {
        List allMovies = finder.findAll();
        for (Iterator it = allMovies.iterator(); it.hasNext();) {
            Movie movie = (Movie) it.next();
            if (!movie.getDirector().equals(arg))
                it.remove();
        }
        return (Movie[]) allMovies.toArray(new Movie[allMovies.size()]);
    }
}

Set up the dependency
1. A configuration Java class needs to be written to define the dependencies.
2. Add @Configuration to tell Spring about your configuration class
3. @Bean should be used to tell Spring that the methods produce Spring beans
4. The initMethod and destroyMethod attributes of the @Bean annotation should be used to define the lifecycle callbacks.
5. @Scope should be used to define whether the scope is singleton, prototype, request, etc.

@Configuration
public class AppConfig {

    // MovieLister is assumed to define matching init() and distroy() methods
    // for the lifecycle attributes below.
    @Bean(destroyMethod = "distroy", initMethod = "init")
    @Scope("prototype")
    public MovieLister movieLister() {
        MovieLister movieLister = new MovieLister();
        movieLister.setFinder(movieFinder());
        return movieLister;
    }

    @Bean
    public MovieFinder movieFinder() {
        ColonMovieFinder colonMovieFinder = new ColonMovieFinder();
        Resource resource = new ClassPathResource("examples/movies.txt");
        colonMovieFinder.setFilename(resource);
        return colonMovieFinder;
    }
}
Construct the application context
1. Pass the class annotated with @Configuration to the AnnotationConfigApplicationContext constructor
2. More than one configuration class can be written, and all of them can be passed to the constructor, as it supports Java 5 varargs


AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(AppConfig.class);

Test class
public class ApplicationContextAnnotationTest {

    public static void main(String[] args) {
        // Initialize the IoC container
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(AppConfig.class);

        // Retrieve the bean from the container
        MovieLister myBean = context.getBean(MovieLister.class);

        for (Movie m : myBean.moviesDirectedBy("Rajnikant")) {
            System.out.println(m);
        }
        context.close();
    }
}

Problem: what if you have two beans of the same type?
1. Pass a name to the @Bean annotation
2. Get the bean from the context using the bean class and name.

Register bean by name
@Bean(name = "moviFinder")
public MovieFinder movieFinder() {
    ColonMovieFinder colonMovieFinder = new ColonMovieFinder();
    Resource resource = new ClassPathResource("examples/movies.txt");
    colonMovieFinder.setFilename(resource);
    return colonMovieFinder;
}

Retrieve bean by name
AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(AppConfig.class);

MovieFinder myFind = context.getBean("moviFinder", MovieFinder.class);


Problem: this approach doesn't cut the configuration; it just moves it from an XML file to a Java file. The right way to cut the configuration is autowiring, which I will cover in the next post.