Mellow_
Sep 13, 2010

:frog:
There's http://www.syncfusion.com but they have a similar price tag to Kendo UI.

raminasi
Jan 25, 2005

a last drink with no ice
Apparently the stupid WPF bug that prevents you from binding OneWayToSource to read-only properties still exists in .NET 4.5.1. Does anybody know if it's in 4.6?

NihilCredo
Jun 6, 2011

iram omni possibili modo preme:
plus una illa te diffamabit, quam multæ virtutes commendabunt

AuxPriest posted:

There's http://www.syncfusion.com but they have a similar price tag to Kendo UI.

Thanks, but what makes you say that SyncFusion works with WebSharper? I can't find any mention of either on the other's page.

The price tag for the UI library I don't mind - my boss has already committed to buying something nice in that area. But a WebSharper wrapper (premium extension) that costs as much as a DevCraft license would be a much harder sell, when there is no such additional cost for MVC; I could probably make the case for purchasing a WebSharper commercial license instead of a UI library, but not on top of one.

Drythe
Aug 26, 2012


 
I don't know if anyone has experience with Salesforce or their SOAP API, but here's a situation I'm not sure I understand:

Salesforce's integration API has a class called SforceService that establishes a connection to the Salesforce environment. For our application they only want to use the one user account to do the work, and they want to minimize the number of times login gets called (since we get a limited amount per day), so they want to make it a static class for the web app. As I understand it, this will make it the same instance across all sessions of this web app, so it only needs to log in once every two hours for all users.

However, thread safety is now a concern of mine. As it stands, the SforceService is stored in a public static class that only contains it, with a private setter, a public getter, and a method to log in again if it has timed out. Another class (we'll call it HelperA) gets created and stored in HttpContext.Current.Items. It then references SforceService to pass it the queries (which are formatted similarly to SQL queries), such as

code:
string query = "select name from contact";
SforceService.query(query);
Since the SforceService is a shared variable in a static class it's going to be taking these queries from every request. So even though the Helper methods where the queries are generated are stored in the context, won't this still cause thread safety issues? I'm not sure I understand this very well.
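For what it's worth, the usual way to make a shared static like that safe is to serialize every call through a lock. A minimal sketch of the pattern; note the SforceService below is a tiny stand-in stub (the real generated SOAP client, credentials, and the two-hour session window are all assumptions from the post, not the actual API):

```csharp
using System;

// Stand-in for the generated SOAP client, so the locking pattern is shown
// self-contained. Real code would use the WSDL-generated SforceService.
public class SforceService
{
    public static int LoginCount;                        // instrumentation for the demo
    public void login(string user, string pass) { LoginCount++; }
    public string query(string soql) { return "ok"; }    // stand-in for the real query call
}

public static class SalesforceSession
{
    private static readonly object _gate = new object();
    private static SforceService _service;
    private static DateTime _loggedInAt = DateTime.MinValue;

    // Every request funnels through here: the lock serializes access to the
    // shared client, and login only fires when the session has expired.
    public static T WithService<T>(Func<SforceService, T> action)
    {
        lock (_gate)
        {
            if (_service == null || DateTime.UtcNow - _loggedInAt > TimeSpan.FromHours(2))
            {
                _service = new SforceService();
                _service.login("integration-user", "password+token"); // counts against the daily API limit
                _loggedInAt = DateTime.UtcNow;
            }
            return action(_service);
        }
    }
}
```

The trade-off is that the lock makes the shared client a throughput bottleneck, so query formatting and result processing should happen outside WithService, keeping only the actual wire call inside the lock.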

crashdome
Jun 28, 2011
You might have to implement a token-based web API to differentiate connections with a generic Login/Logout/Timeout and have that do all the calls to a back-end service which remains logged in at all times to Salesforce. Sounds like a nightmare to be honest. Why is there a daily limit to logins for an account which does a butt-load of work?

edit: Ack, nevermind. That actually doesn't really help you.

crashdome fucked around with this message at 20:01 on Jun 9, 2015

Drythe
Aug 26, 2012


 

crashdome posted:

You might have to implement a token-based web API to differentiate connections with a generic Login/Logout/Timeout and have that do all the calls to a back-end service which remains logged in at all times to Salesforce. Sounds like a nightmare to be honest. Why is there a daily limit to logins for an account which does a butt-load of work?

gently caress if I know, Salesforce is dumb? It's a general API call limit, and logging in counts as an API call.

I would prefer to not use this but I'm not the one making the calls.

aBagorn
Aug 26, 2004
Ok I hope I'm wording this right, but I want to figure out how to do something, and I'm really dumb when it comes to async/await stuff in general.

Currently we are processing csv files (that could contain upwards of 5 million rows) and populating the results into objects that will eventually live in the database. The methods are pretty convoluted and involve multiple steps, which all have to be done in order (for now). Some samples of the code I've inherited below. I feel like it belongs in the coding horrors thread.

C# code:
public bool ProcessInputFile(string inputFilePath)
{
    using (Stream stream = File.Open(inputFilePath, FileMode.Open, FileAccess.Read, FileShare.None))
	{
		var rawDataList = GetRawDataList(stream);
		var rawDataRowList = ProcessRawDataRowList(rawDataList);
		
		List<dbObject> objectList = new List<dbObject>();
		
		var count = 0;
		
		using (var _context = new ApplicationContext())
		{
			foreach (var rawDataRow in rawDataRowList)
			{
				//create new dbObject 
				var newDbObject = CreateNewDbObjectFromRawDataRow(rawDataRow);
				var dupeCount = 0;
				
				//here be logic here to check against the database to see if this is a duplicate
				
				if (dupeCount == 0)
				{
					objectList.Add(newDbObject);
					count++;
				}
				else
				{
					//logic here that writes to a logger that this record was flagged as a duplicate
					dupeCount = 0;
				}
				
				if (count > 1 && count % 30000 == 0)
				{
					_context.dbObjects.AddRange(objectList);
					_context.SaveChanges();
					objectList.Clear();
				}
			}			
			if (objectList.Count > 0)
			{
				_context.dbObjects.AddRange(objectList);
			}
		}
	}
	return true;
}

private List<RawData> GetRawDataList(Stream stream)
{
	//logic here to turn each line into a RawDataObject (contains data as string and row number from csv file)
	return rawDataList; 
}

private List<RawDataRow> ProcessRawDataRowList(List<RawData> rawDataList)
{
	List<RawDataRow> rawDataRowList = new List<RawDataRow>();
	foreach (var rawData in rawDataList)
	{
		//split rawData on '|' delimiter and validate each field
		var rawDataRow = ConstructRawDataRowObject(rawData.data);
		rawDataRowList.Add(rawDataRow);
	}
	
	return rawDataRowList;
}

private dbObject CreateNewDbObjectFromRawDataRow(RawDataRow row)
{
	//logic here to transform RawDataRow to dbObject	
}
So the process basically flows file -> list of rawDataObjects -> foreach loop to make list of rawDataRow objects -> foreach loop to transform rawDataRow to dbObjects and save them via EF in batches.

I don't have too much leeway to completely gut everything (i.e., we save those rawData objects to their own table at one point and FK the dbObjects to them) so I can't really skip any of the steps.

What I'd like to do, however, is potentially run the first foreach loop until I hit some arbitrary number of rawDataRow objects (say 50k) and immediately kick off the foreach(var rawDataRow in rawDataRowList) loop with that set while the next 50k rawDatas get transformed into rawDataRows.

This should be possible, right?
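The overlap being described is a classic producer/consumer pipeline. A hedged sketch using BlockingCollection — the RawData/RawDataRow types and the saveBatch callback are stand-ins for the real code, not the actual implementation:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

// Stand-ins for the poster's types.
public class RawData { public string Data; }
public class RawDataRow { public string[] Fields; }

public static class Pipeline
{
    // The producer parses raw lines into RawDataRow on a background task while
    // the consumer batches and persists them, so the two stages overlap instead
    // of running strictly one after the other.
    public static int Run(IEnumerable<RawData> rawDataList,
                          Action<List<RawDataRow>> saveBatch,
                          int batchSize = 50000)
    {
        // Bounded so the producer can't race arbitrarily far ahead of the consumer.
        var queue = new BlockingCollection<RawDataRow>(boundedCapacity: batchSize * 2);

        var producer = Task.Run(() =>
        {
            foreach (var rawData in rawDataList)
                queue.Add(new RawDataRow { Fields = rawData.Data.Split('|') });
            queue.CompleteAdding();
        });

        int total = 0;
        var batch = new List<RawDataRow>(batchSize);
        foreach (var row in queue.GetConsumingEnumerable())
        {
            batch.Add(row);
            if (batch.Count == batchSize)
            {
                saveBatch(batch);          // caller must copy if it retains the list
                total += batch.Count;
                batch.Clear();
            }
        }
        if (batch.Count > 0) { saveBatch(batch); total += batch.Count; }

        producer.Wait();                   // propagate any producer exception
        return total;
    }
}
```

Whether this helps depends on where the time actually goes: if the save stage dominates (as later replies suggest), overlapping the parse stage with it buys very little.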

ljw1004
Jan 18, 2005

rum

aBagorn posted:

What I'd like to do, however, is potentially run the first foreach loop until I hit some arbitrary number of rawDataRow objects (say 50k) and immediately kick off the foreach(var rawDataRow in rawDataRowList) loop with that set while the next 50k rawDatas get transformed into rawDataRows.

Just as a sanity check -- why?

I assume you're running some kind of nightly batch processor that does this work? And you have more CPU/RAM on your processing machine to throw at the problem? Are you looking to have the overall job COMPLETE in a shorter time? Or are you looking to have partial results available immediately for some reason?

(I ask because async/await is normally unrelated to these kinds of problems, and it's important to know exactly what+why you're doing...)

aBagorn
Aug 26, 2004

ljw1004 posted:

Just as a sanity check -- why?

I assume you're running some kind of nightly batch processor that does this work? And you have more CPU/RAM on your processing machine to throw at the problem? Are you looking to have the overall job COMPLETE in a shorter time? Or are you looking to have partial results available immediately for some reason?

(I ask because async/await is normally unrelated to these kinds of problems, and it's important to know exactly what+why you're doing...)

I'm looking to have the overall job complete in a shorter time. We're running this in Azure, so shorter processing times = less $$$

(I figured I didn't know what I was talking about)

As far as resources go, I cranked up the VM this service was running on to a D13 (8 cores 56GB RAM) and we're 3 days into processing a file with 5 million records, which is unacceptable to the client.

I have another strong feeling that reaching out to the DB (which lives on another server) during the foreach loops is also potentially a root cause of the problem, as well as improper management of the EF Context

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
5 million records isn't much but how you do it can make a big difference. My first suggestion would be to look into SQL Server Integration Services (SSIS) which is Microsoft's ETL (Extract-Transform-Load) software. It's meant to do this kind of work. When I was doing this kind of work 5 million records would probably be 10-15 minutes if the SSIS package was designed correctly.

My next suggestion is can you increase your data locality by caching the database in memory on the server? Going across the network is going to kill your performance especially if you are reaching across the Internet over a VPN. Even a 5ms round trip call (SQL Server on a LAN) for 5 million records is 6.9 hours of waiting on your network. Not counting the call to insert everything (another 6.9 hours). I haven't done much EF but it doesn't seem to be built to load millions of records.
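That round-trip figure checks out; a throwaway sanity calc:

```csharp
using System;

// Back-of-envelope check: one 5 ms round trip per record, 5 million records.
double seconds = 5_000_000 * 0.005;      // 25,000 seconds of pure network waiting
double hours = seconds / 3600.0;         // ≈ 6.9 hours, before any insert traffic
Console.WriteLine($"{hours:F1} hours of round trips");
```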

aBagorn
Aug 26, 2004

gariig posted:

I haven't done much EF but it doesn't seem to be built to load millions of records.

It's not. I'd love to decouple this all from EF but that would require gutting the service, which isn't an option right now.

I could probably use a raw SQL query to pull a selection of Rows that are likely candidates for duplicates at the beginning of the process and put them in memory.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

aBagorn posted:

Ok I hope I'm wording this right, but I want to figure out how to do something, and I'm really dumb when it comes to async/await stuff in general.

Currently we are processing csv files (that could contain upwards of 5 million rows) and populating the results into objects that will eventually live in the database. The methods are pretty convoluted and involve multiple steps, which all have to be done in order (for now). Some samples of the code I've inherited below. I feel like it belongs in the coding horrors thread.

C# code:
:words:
So the process basically flows file -> list of rawDataObjects -> foreach loop to make list of rawDataRow objects -> foreach loop to transform rawDataRow to dbObjects and save them via EF in batches.

I don't have too much leeway to completely gut everything (i.e., we save those rawData objects to their own table at one point and FK the dbObjects to them) so I can't really skip any of the steps.

What I'd like to do, however, is potentially run the first foreach loop until I hit some arbitrary number of rawDataRow objects (say 50k) and immediately kick off the foreach(var rawDataRow in rawDataRowList) loop with that set while the next 50k rawDatas get transformed into rawDataRows.

This should be possible, right?

ugh

use a merge into, generate the csv row -> table row client side and just fire that poo poo off in one query, it'll be one big ol req and you can even have it log dupes ("WHEN MATCHED THEN fart")


batching only makes sense if csvrow->dbobject is expensive so you can parallelize the work otherwise you are io bound anyway


this looks like it's batching something in 30k chunks already



aBagorn posted:

I'm looking to have the overall job complete in a shorter time. We're running this in Azure, so shorter processing times = less $$$

(I figured I didn't know what I was talking about)

As far as resources go, I cranked up the VM this service was running on to a D13 (8 cores 56GB RAM) and we're 3 days into processing a file with 5 million records, which is unacceptable to the client.

I have another strong feeling that reaching out to the DB (which lives on another server) during the foreach loops is also potentially a root cause of the problem, as well as improper management of the EF Context

jesus christ

Malcolm XML
Aug 8, 2009

I always knew it would end like this.
also you are materializing the stream into lists, so you have a shitload of rows in memory


you probs want to stream ienumerables and manipulate using LINQ
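A sketch of that streaming shape, with stand-in types: read the file lazily with yield return and batch without ever materializing the whole thing:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Stand-in for the poster's type.
public class RawDataRow { public string[] Fields; }

public static class Streaming
{
    // One line at a time: only the current line (and the current batch) is in
    // memory, instead of three full copies of a 5-million-row file.
    public static IEnumerable<RawDataRow> ReadRows(TextReader reader)
    {
        string line;
        while ((line = reader.ReadLine()) != null)
            yield return new RawDataRow { Fields = line.Split('|') };
    }

    // Chunk a lazy sequence into batches without materializing the source.
    public static IEnumerable<List<T>> Batch<T>(IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;
                batch = new List<T>(size);   // fresh list; the yielded one may be retained
            }
        }
        if (batch.Count > 0) yield return batch;
    }
}
```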

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

gariig posted:

5 million records isn't much but how you do it can make a big difference. My first suggestion would be to look into SQL Server Integration Services (SSIS) which is Microsoft's ETL (Extract-Transform-Load) software. It's meant to do this kind of work. When I was doing this kind of work 5 million records would probably be 10-15 minutes if the SSIS package was designed correctly.

My next suggestion is can you increase your data locality by caching the database in memory on the server? Going across the network is going to kill your performance especially if you are reaching across the Internet over a VPN. Even a 5ms round trip call (SQL Server on a LAN) for 5 million records is 6.9 hours of waiting on your network. Not counting the call to insert everything (another 6.9 hours). I haven't done much EF but it doesn't seem to be built to load millions of records.

if this is creating a new req for each row i can see how it's taking days (!!!) unless constructing a dbobject from a row is super duper expensive

gariig
Dec 31, 2004
Beaten into submission by my fiance
Pillbug
My napkin math was just a guess from looking at aBagorn's code. It looked like the code was doing a dupe check for each object. Even so, it was just to show that even a normally cheap operation (a call to a database) done millions of times can be very significant. My 10-15 minute guess is from when I was doing ETL at a previous job, where we'd do this basic stuff with the whole operation done in memory.

I always did ETL in SSIS but merge into also works. Usually I was doing just enough crazy parsing that I wanted more control over the process. Plus SSIS could do the parse file -> turn into row level data -> insert row level data into SQL Server.

I think aBagorn has a lot of GC but it's probably mostly gen0 and way overshadowed by the network calls. If you can, I'd try a smaller set of data and use some performance analyzers, both .NET and SQL Server. If that's not possible I'd stop the network calls.

GoodCleanFun
Jan 28, 2004

aBagorn posted:

So the process basically flows file -> list of rawDataObjects -> foreach loop to make list of rawDataRow objects -> foreach loop to transform rawDataRow to dbObjects and save them via EF in batches.

If the slowness is at the DB level on insertion due to EF then you should just write your own inserts. SqlBulkCopy class is what you want to use.
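For reference, the usual SqlBulkCopy shape is "build a DataTable per batch, stream it to the server". Table and column names below are invented, and the actual server call is shown in comments because it needs a live connection string and SQL Server to run:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

public static class BulkLoader
{
    // Hypothetical two-column target; the real code would mirror the dbObject shape.
    public static DataTable ToTable(IEnumerable<(string Name, string Phone)> contacts)
    {
        var table = new DataTable("Contacts");
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Phone", typeof(string));
        foreach (var (name, phone) in contacts)
            table.Rows.Add(name, phone);
        return table;
    }

    // With a live connection the insert path is one streaming bulk operation,
    // not row-by-row INSERTs:
    //
    //   using (var bulk = new SqlBulkCopy(connectionString))
    //   {
    //       bulk.DestinationTableName = "dbo.Contacts";   // hypothetical target
    //       bulk.BatchSize = 30000;                       // same batch size the code above uses
    //       bulk.WriteToServer(ToTable(batch));
    //   }
}
```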

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

aBagorn posted:

I have another strong feeling that reaching out to the DB (which lives on another server) during the foreach loops is also potentially a root cause of the problem, as well as improper management of the EF Context

I 100% guarantee you that this is your problem. Find a way to look for duplicates that doesn't involve a repeated database call. I don't know what your duplicate checking logic is like, but there's a good chance it can be done outside of the DB. Or you can try to do it all inside the DB, just for the love of god don't split it across two machines.


aBagorn posted:

As far as resources go, I cranked up the VM this service was running on to a D13 (8 cores 56GB RAM) and we're 3 days into processing a file with 5 million records, which is unacceptable to the client.

This is probably a waste. There's a significant chance that you're latency bound, which means you're not burning CPU. Have you checked the load average on the VM?


aBagorn posted:

C# code:
    count++;
    ...
    if (count > 1 && count % 30000 == 0)

Also, just for giggles, isn't this just if (count == 30000)?

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Bognar posted:

I 100% guarantee you that this is your problem. Find a way to look for duplicates that doesn't involve a repeated database call. I don't know what your duplicate checking logic is like, but there's a good chance it can be done outside of the DB. Or you can try to do it all inside the DB, just for the love of god don't split it across two machines.


This is probably a waste. There's a significant chance that you're latency bound, which means you're not burning CPU. Have you checked the load average on the VM?


Also, just for giggles, isn't this just if (count == 30000)?

loading data into sql server with dupe detection can and should be done in 1-2 statements (create temp or CTE on a merge into and log matches)

god knows what's going on inside EF to generate the row tuple.
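A sketch of the staged MERGE being described, with hypothetical table and column names throughout: bulk-load the parsed rows into a staging table, then one statement inserts the new rows and reports duplicates, instead of one round trip per record:

```csharp
// Hypothetical names (dbo.Contacts, #StagingContacts, ExternalId) — the shape
// is the point: one MERGE handles insert + duplicate detection server-side.
public static class MergeSql
{
    public const string Statement = @"
MERGE INTO dbo.Contacts AS target
USING #StagingContacts AS source
    ON target.ExternalId = source.ExternalId
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ExternalId, Name) VALUES (source.ExternalId, source.Name)
WHEN MATCHED THEN
    UPDATE SET target.DupeFlag = 1
OUTPUT $action, inserted.ExternalId;";
    // $action comes back as INSERT or UPDATE per row, so matched (duplicate)
    // rows can be logged from the result set without a second pass.
}
```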

Faldoncow
Jun 29, 2007
Munchin' on some steak
For hosting ASP.Net Web API 2, I know I can use IIS and apparently I can have it self-host with OWIN. Are there any major advantages to IIS, or issues with self-hosting with OWIN? This is on a Windows Embedded system, if that makes any difference.

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Malcolm XML posted:

loading data into sql server with dupe detection can and should be done in 1-2 statements (create temp or CTE on a merge into and log matches)

I'm just giving them the benefit of the doubt that maybe there's some special dupe detection logic going on that needed to be done in code and not SQL. That's probably naive, but the code was omitted so I'm not going to make any assumptions.

Kekekela
Oct 28, 2004

Bognar posted:

Also, just for giggles, isn't this just if (count == 30000)?
or 60k, or 90k, or 120k, etc

wwb
Aug 17, 2004

Malcolm XML posted:

loading data into sql server with dupe detection can and should be done in 1-2 statements (create temp or CTE on a merge into and log matches)

god knows what's going on inside EF to generate the row tuple.

This. Or if you need fancier than you want to do in sql just use ADO.NET or dapper if you are too cool for raw sql.

Faldoncow posted:

For hosting ASP.Net Web API 2, I know I can use IIS and apparently I can have it self-host with OWIN. Are there any major advantages to IIS, or issues with self-hosting with OWIN? This is on a Windows Embedded system, if that makes any difference.

Self-hosted probably makes sense there. IIS is great if you need public-facing, hardened, internet-ready stuff and/or manageability. But self-hosting is great for internal app APIs and IoT poo poo.
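For reference, a minimal OWIN self-host sketch (this needs the Microsoft.AspNet.WebApi.OwinSelfHost NuGet package, and the port and route are placeholders, so it is an untested outline rather than a drop-in):

```csharp
using System.Web.Http;
using Microsoft.Owin.Hosting;
using Owin;

// Standard Web API 2 configuration, wired into the OWIN pipeline.
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var config = new HttpConfiguration();
        config.MapHttpAttributeRoutes();
        config.Routes.MapHttpRoute("DefaultApi", "api/{controller}/{id}",
            new { id = RouteParameter.Optional });
        app.UseWebApi(config);
    }
}

public static class Program
{
    public static void Main()
    {
        // No IIS involved: the console process owns the HTTP listener.
        using (WebApp.Start<Startup>("http://localhost:9000/"))
        {
            System.Console.WriteLine("Listening on http://localhost:9000/");
            System.Console.ReadLine();   // keep the process alive until Enter
        }
    }
}
```

On an embedded box this usually runs as a Windows service rather than a console app, with the same WebApp.Start call in the service's start handler.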

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

Kekekela posted:

or 60k, or 90k, or 120k, etc

Except count is incremented only once each time before the if statement, so it's only ever 30k.

Kekekela
Oct 28, 2004

Bognar posted:

Except count is incremented only once each time before the if statement, so it's only ever 30k.

I'm not getting what you mean, this is how I would understand it with non-relevant bits removed:

code:
foreach (var rawDataRow in rawDataRowList)
{
    var dupeCount = 0;
    
    //here be logic here to check against the database to see if this is a duplicate
    // i'd assume dupecount could be changed in the code that was removed, otherwise the following is always true
    if (dupeCount == 0)
    {
        count++; // count gets incremented, this could happen every time through the loop so....
    }
    
    if (count > 1 && count % 30000 == 0)  // ....count could be anything 
    {
    }
}	

Kekekela fucked around with this message at 20:50 on Jun 10, 2015

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy
Wow, I thought count was being set to 0 in that if statement, but I guess I just imagined it.

aBagorn
Aug 26, 2004
So I think I'm going to make a recommendation to ditch EF if at all possible.

This service was written before I got here and dealt with files that contained at most a few thousand rows, and the fact that it was not designed to scale is showing.

The only problem I foresee is that the original dev did things like this with EF relationships to the dbObject before inserting.

C# code:
dbObject.dbObjectOwner = new dbObjectOwner();
dbObject.ListOfThingsRelated = ThingListCreatedBefore;

And inserts with foreign keys and multiple join tables for many to many relationships.

I started looking into BULK insert but it seems like it's going to be multiple steps, especially if we are getting away from inserting these fully hydrated EF objects

aBagorn fucked around with this message at 22:51 on Jun 10, 2015

Kekekela
Oct 28, 2004

aBagorn posted:

So I think I'm going to make a recommendation to ditch EF if at all possible.

Tell them that fellow CoC poster kekekela agrees with your recommendation.

ljw1004
Jan 18, 2005

rum

Kekekela posted:

"ditch EF"
Tell them that fellow CoC poster kekekela agrees with your recommendation.

I like this post: "ORM is the Vietnam of computer science"
http://blog.codinghorror.com/object-relational-mapping-is-the-vietnam-of-computer-science/

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

aBagorn posted:

So I think I'm going to make a recommendation to ditch EF if at all possible.

This service was written before I got here and dealt with files that contained at most a few thousand rows, and the fact that it was not designed to scale is showing.

The only problem I foresee is that the original dev did things like this with EF relationships to the dbObject before inserting.

C# code:

dbObject.dbObjectOwner = new dbObjectOwner(); 

dbObject.ListOfThingsRelated = ThingListCreatedBefore;

And inserts with foreign keys and multiple join tables for many to many relationships.

I started looking into BULK insert but it seems like it's going to be multiple steps, especially if we are getting away from inserting these fully hydrated EF objects

as much as i dislike ORMs i dont think EF is your (only) issue. at worst you can have EF act as an object->sql statement creator

but that isn't all that slow even if you have each object insert in its own transaction (5mil transactions is trivial)

clearly something is deeply hosed in your validation and creation of the object. fix that first.

RICHUNCLEPENNYBAGS
Dec 21, 2010
If EF is a bottleneck in a handful of workflows why not use raw SQL there and keep it where it's not an issue?

Bulk inserts and deletes are one thing that EF does pretty poorly but there are optimization tricks that might be good enough (turn off auto change-tracking which literally descends the entire object graph every time an object is changed, commit every 100 records or so, etc).
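A runnable sketch of those two tricks together; a fake context stands in for the real EF DbContext so the batching shape is visible (the real property is context.Configuration.AutoDetectChangesEnabled, and ApplicationContext would replace FakeContext):

```csharp
using System;
using System.Collections.Generic;

public class DbObject { public int Id; }

// Stand-in for an EF DbContext, instrumented so the commit pattern is testable.
public class FakeContext : IDisposable
{
    public static int SaveCalls;                 // counts commits for the demo
    public List<DbObject> DbObjects = new List<DbObject>();
    public bool AutoDetectChangesEnabled = true; // real EF: Configuration.AutoDetectChangesEnabled
    public void SaveChanges() { SaveCalls++; }
    public void Dispose() { }
}

public static class EfBatching
{
    // Commit every batchSize records and throw the context away, so the change
    // tracker never accumulates millions of tracked entities.
    public static void SaveInBatches(IEnumerable<DbObject> objects, int batchSize)
    {
        FakeContext context = null;
        try
        {
            var pending = 0;
            foreach (var obj in objects)
            {
                if (context == null)
                {
                    context = new FakeContext();
                    context.AutoDetectChangesEnabled = false; // skip the graph walk on every Add
                }
                context.DbObjects.Add(obj);
                if (++pending == batchSize)
                {
                    context.SaveChanges();
                    context.Dispose();    // fresh context per batch => empty change tracker
                    context = null;
                    pending = 0;
                }
            }
            if (context != null && pending > 0)
                context.SaveChanges();    // flush the final partial batch
        }
        finally
        {
            if (context != null) context.Dispose();
        }
    }
}
```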

Kekekela
Oct 28, 2004
I didn't mean to come off as anti-EF, just with the issues he's having and the size of the data I think it's the wrong tool for the job here.

e: that article makes me wonder if I'm going to end up liking NoSql db's more than I'd initially thought, I'm pretty sure I'm about to get my first real-world exposure to them on an upcoming side-project.

Kekekela fucked around with this message at 03:50 on Jun 11, 2015

RICHUNCLEPENNYBAGS
Dec 21, 2010

Kekekela posted:

I didn't mean to come off as anti-EF, just with the issues he's having and the size of the data I think it's the wrong tool for the job here.

e: that article makes me wonder if I'm going to end up liking NoSql db's more than I'd initially thought, I'm pretty sure I'm about to get my first real-world exposure to them on an upcoming side-project.

Something that describes everything from key-value stores to document databases to object databases is not really that useful a category. IMO.

Kekekela
Oct 28, 2004

RICHUNCLEPENNYBAGS posted:

Something that describes everything from key-value stores to document databases to object databases is not really that useful a category. IMO.
Sorry, document database in my case. I was really just looking for another way of saying "not a relational database".

Kekekela fucked around with this message at 03:57 on Jun 11, 2015

chippy
Aug 16, 2006

OK I DON'T GET IT

GoodCleanFun posted:

If the slowness is at the DB level on insertion due to EF then you should just write your own inserts. SqlBulkCopy class is what you want to use.

Just thought I'd mention that if you end up using SqlBulkCopy (which will save you a lot of time if you are doing a lot of inserts; I made some pretty good speed increases with it recently) and you're working with collections of objects you want to insert, FastMember (by Marc Gravell of Dapper fame) makes it a little easier. Usually SqlBulkCopy requires that you construct a DataTable first; with FastMember you can basically wrap a collection of objects in an IDataReader and use that instead. It's worked pretty well for me.

https://code.google.com/p/fast-member/

uXs
May 3, 2005

Mark it zero!
Question about installing a vNext (or whatever its name is) website on IIS 7:

I have a site that works locally, both in IIS Express and in the full version, that I can't get to work on a server. It keeps giving me an "HTTP Error 403.14 - Forbidden
The Web server is configured to not list the contents of this directory." error. (I obviously don't want it to list the contents of the directory, I want it to give me my website.)

Other websites, made in previous versions of .Net (4.5) do work. I copy/pasted another website in the wwwroot of my new vNext website and that worked right away.

I installed .NET 4.6 on the server too, to no avail.

Any ideas?

edit: just discovered that vNext probably requires IIS 7.5. At least that's what my server told me when I enabled directory browsing and tried to browse to the Views folder.

uXs fucked around with this message at 16:44 on Jun 11, 2015

RICHUNCLEPENNYBAGS
Dec 21, 2010

chippy posted:

Just thought I'd mention that if you end up using SqlBulkCopy (which will save you a lot of time if you are doing a lot of inserts; I made some pretty good speed increases with it recently) and you're working with collections of objects you want to insert, FastMember (by Marc Gravell of Dapper fame) makes it a little easier. Usually SqlBulkCopy requires that you construct a DataTable first; with FastMember you can basically wrap a collection of objects in an IDataReader and use that instead. It's worked pretty well for me.

https://code.google.com/p/fast-member/

That sounds pretty handy! Cool.

Sab669
Sep 24, 2009

edit; never mind. Finally sort-of fixed it and it's good enough for now.

Sab669 fucked around with this message at 21:03 on Jun 11, 2015

Bognar
Aug 4, 2011

I am the queen of France
Hot Rope Guy

uXs posted:

Question about installing a vNext (or whatever its name is) website on IIS 7:

I have a site that works locally, both in IIS Express and in the full version, that I can't get to work on a server. It keeps giving me an "HTTP Error 403.14 - Forbidden
The Web server is configured to not list the contents of this directory." error. (I obviously don't want it to list the contents of the directory, I want it to give me my website.)

Other websites, made in previous versions of .Net (4.5) do work. I copy/pasted another website in the wwwroot of my new vNext website and that worked right away.

I installed .NET 4.6 on the server too, to no avail.

Any ideas?

edit: just discovered that vNext probably requires IIS 7.5. At least that's what my server told me when I enabled directory browsing and tried to browse to the Views folder.

I'm not sure how you're deploying to IIS, but this could be helpful:

http://docs.asp.net/en/latest/publishing/iis.html

epswing
Nov 4, 2003

Soiled Meat
Cross posting from the Goons for Hire thread, if that's OK. Looking for an ASP.NET MVC/WebAPI dev!

http://forums.somethingawful.com/showthread.php?threadid=3246449&pagenumber=18#post446490530


brap
Aug 23, 2004

Grimey Drawer
Are there any trip reports out there of people's experience working on ASP.NET projects with Visual Studio Code?
